Ankit Bansal
SQL Test Based on Real Interview | SQL Interview Questions and Answers
In this video we will solve a complete SQL test consisting of 7 interview questions. This is a great exercise to practice SQL.
Start your data analytics journey: www.namastesql.com/
script:
CREATE TABLE users (
USER_ID INT PRIMARY KEY,
USER_NAME VARCHAR(20) NOT NULL,
USER_STATUS VARCHAR(20) NOT NULL
);
CREATE TABLE logins (
USER_ID INT,
LOGIN_TIMESTAMP DATETIME NOT NULL,
SESSION_ID INT PRIMARY KEY,
SESSION_SCORE INT,
FOREIGN KEY (USER_ID) REFERENCES USERS(USER_ID)
);
-- Users Table
INSERT INTO USERS VALUES (1, 'Alice', 'Active');
INSERT INTO USERS VALUES (2, 'Bob', 'Inactive');
INSERT INTO USERS VALUES (3, 'Charlie', 'Active');
INSERT INTO USERS VALUES (4, 'David', 'Active');
INSERT INTO USERS VALUES (5, 'Eve', 'Inactive');
INSERT INTO USERS VALUES (6, 'Frank', 'Active');
INSERT INTO USERS VALUES (7, 'Grace', 'Inactive');
INSERT INTO USERS VALUES (8, 'Heidi', 'Active');
INSERT INTO USERS VALUES (9, 'Ivan', 'Inactive');
INSERT INTO USERS VALUES (10, 'Judy', 'Active');
-- Logins Table
INSERT INTO LOGINS VALUES (1, '2023-07-15 09:30:00', 1001, 85);
INSERT INTO LOGINS VALUES (2, '2023-07-22 10:00:00', 1002, 90);
INSERT INTO LOGINS VALUES (3, '2023-08-10 11:15:00', 1003, 75);
INSERT INTO LOGINS VALUES (4, '2023-08-20 14:00:00', 1004, 88);
INSERT INTO LOGINS VALUES (5, '2023-09-05 16:45:00', 1005, 82);
INSERT INTO LOGINS VALUES (6, '2023-10-12 08:30:00', 1006, 77);
INSERT INTO LOGINS VALUES (7, '2023-11-18 09:00:00', 1007, 81);
INSERT INTO LOGINS VALUES (8, '2023-12-01 10:30:00', 1008, 84);
INSERT INTO LOGINS VALUES (9, '2023-12-15 13:15:00', 1009, 79);
-- 2024 Q1
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (1, '2024-01-10 07:45:00', 1011, 86);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (2, '2024-01-25 09:30:00', 1012, 89);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (3, '2024-02-05 11:00:00', 1013, 78);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (4, '2024-03-01 14:30:00', 1014, 91);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (5, '2024-03-15 16:00:00', 1015, 83);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (6, '2024-04-12 08:00:00', 1016, 80);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (7, '2024-05-18 09:15:00', 1017, 82);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (8, '2024-05-28 10:45:00', 1018, 87);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (9, '2024-06-15 13:30:00', 1019, 76);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (10, '2024-06-25 15:00:00', 1010, 92);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (10, '2024-06-26 15:45:00', 1020, 93);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (10, '2024-06-27 15:00:00', 1021, 92);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (10, '2024-06-28 15:45:00', 1022, 93);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (1, '2024-01-10 07:45:00', 1101, 86);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (3, '2024-01-25 09:30:00', 1102, 89);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (5, '2024-01-15 11:00:00', 1103, 78);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (2, '2023-11-10 07:45:00', 1201, 82);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (4, '2023-11-25 09:30:00', 1202, 84);
INSERT INTO LOGINS (USER_ID, LOGIN_TIMESTAMP, SESSION_ID, SESSION_SCORE) VALUES (6, '2023-11-15 11:00:00', 1203, 80);
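The 7 test questions themselves are not listed in this description, but as a warm-up against the schema above, here is a minimal sketch (works in both MySQL and SQL Server); it is illustrative only and not one of the official questions:
-- Warm-up: users with no login at all during 2024.
SELECT u.USER_ID, u.USER_NAME, u.USER_STATUS
FROM users u
LEFT JOIN logins l
       ON l.USER_ID = u.USER_ID
      AND l.LOGIN_TIMESTAMP >= '2024-01-01'
      AND l.LOGIN_TIMESTAMP <  '2025-01-01'
WHERE l.SESSION_ID IS NULL;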
Zero to hero(Advance) SQL Aggregation:
ua-cam.com/video/5Ighj_2PGV0/v-deo.html
Most Asked Join Based Interview Question:
ua-cam.com/video/xR87ctOgpAE/v-deo.html
Solving 4 Trick SQL problems:
ua-cam.com/video/Ck1gQrlS5pQ/v-deo.html
Data Analyst Spotify Case Study:
ua-cam.com/video/-YdAIMjHZrM/v-deo.html
Top 10 SQL interview Questions:
ua-cam.com/video/Iv9qBz-cyVA/v-deo.html
Interview Question based on FULL OUTER JOIN:
ua-cam.com/video/KQfWd6V3IB8/v-deo.html
Playlist to master SQL :
ua-cam.com/play/PLBTZqjSKn0IeKBQDjLmzisazhqQy4iGkb.html
Rank, Dense_Rank and Row_Number:
ua-cam.com/video/xMWEVFC4FOk/v-deo.html
#sql #dataengineer
Views: 9,792

Videos

Difference Between SQL Views vs Materialized Views | Frequently Asked SQL Interview Question
Views: 7K · 14 days ago
In this video we will understand SQL views and the difference between SQL views and materialized views. This is a very frequently asked SQL interview question, and the only video you need to answer it in your next SQL interview. script: CREATE TABLE orders ( order_id INT, order_date DATE, product_name VARCHAR(20), sales INT ); INSERT INTO orders (order_id, order_date, product_name, sa...
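A minimal sketch of the distinction on the orders table from this script, assuming a database that supports CREATE MATERIALIZED VIEW (e.g. PostgreSQL or Oracle; SQL Server uses indexed views instead):
-- Plain view: just a stored query, re-executed on every read.
CREATE VIEW product_sales AS
SELECT product_name, SUM(sales) AS total_sales
FROM orders
GROUP BY product_name;
-- Materialized view: the result set is physically stored and must be refreshed.
CREATE MATERIALIZED VIEW product_sales_mv AS
SELECT product_name, SUM(sales) AS total_sales
FROM orders
GROUP BY product_name;
REFRESH MATERIALIZED VIEW product_sales_mv;  -- PostgreSQL refresh syntax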
Fractal Analytics SQL Interview Question (Game of Thrones Database) | SQL for Data Engineer
Views: 17K · 14 days ago
In this video we will discuss an advanced SQL interview question asked in a Data Engineer interview, based on a Game of Thrones database. Kick off your data analytics journey: www.namastesql.com/ Script: Create the 'king' table CREATE TABLE king ( k_no INT PRIMARY KEY, king VARCHAR(50), house VARCHAR(50) ); Create the 'battle' table CREATE TABLE battle ( battle_number INT PRIMARY KEY, name VARC...
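The battle table's columns are cut off in this snippet; assuming it follows the usual Game of Thrones battles dataset with hypothetical columns attacker_king and defender_king referencing king.k_no, a sketch of the kind of query such a question tends to ask could look like:
-- Hypothetical sketch: battles fought by each king as attacker or defender.
-- attacker_king / defender_king are assumed column names, not from the script above.
SELECT k.king, k.house, COUNT(*) AS battles_fought
FROM king k
JOIN battle b
  ON k.k_no IN (b.attacker_king, b.defender_king)
GROUP BY k.king, k.house;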
Accenture SQL Interview Question | Database Case Sensitivity vs Insensitivity
Views: 18K · 28 days ago
In this video we will discuss an Accenture SQL interview question and solve it both when the data is case sensitive and when it is case insensitive. Here is the script: CREATE TABLE employees (employee_id int, employee_name varchar(15), email_id varchar(15) ); delete from employees; INSERT INTO employees (employee_id, employee_name, email_id) VALUES ('101','Liam Alton', 'li.al@abc.com'); INSERT INTO empl...
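A minimal sketch of the case-sensitivity angle on the employees table above, assuming SQL Server, where a per-comparison collation decides whether 'LIAM ALTON' matches 'Liam Alton' (MySQL can achieve the same with the BINARY keyword):
-- Case-insensitive comparison (CI collation): matches the 'Liam Alton' row.
SELECT * FROM employees
WHERE employee_name COLLATE SQL_Latin1_General_CP1_CI_AS = 'LIAM ALTON';
-- Case-sensitive comparison (CS collation): returns no rows for 'LIAM ALTON'.
SELECT * FROM employees
WHERE employee_name COLLATE SQL_Latin1_General_CP1_CS_AS = 'LIAM ALTON';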
BCA to Data Analyst at Big Tech Product Based Company Ex-Amazon, PhonePe, Noon
Views: 4.2K · a month ago
In this podcast with Vishal Sahu, we discuss how he moved from a BCA to data analytics at a ride-sharing company and worked with big companies like Amazon, Noon, PhonePe etc. We discussed the following points: 1- Salaries in the analytics domain 2- What skills are needed to move into data analytics 3- What triggered him to move to data analytics from L2 at Amazon. 4- How it feels to work in ...
15 Days of Learning SQL | Advanced SQL for Data Analytics
Views: 19K · a month ago
15 days of learning SQL. In this video we will deep dive into a problem called "15 Days of Learning SQL" from HackerRank. This is a challenging SQL problem to solve. Here is the script: CREATE TABLE Submissions ( submission_date DATE, submission_id INT PRIMARY KEY, hacker_id INT, score INT ); INSERT INTO Submissions (submission_date, submission_id, hacker_id, score) VALUES ('2016-03-01', 8494, 20...
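Not the full HackerRank solution, but a sketch of one building block on the Submissions table above: counting the distinct hackers who submitted on each date.
-- Building block only: distinct hackers per day. The full problem additionally needs
-- the hackers who submitted every day up to that date, plus a tie-broken top hacker.
SELECT submission_date, COUNT(DISTINCT hacker_id) AS hackers_that_day
FROM Submissions
GROUP BY submission_date
ORDER BY submission_date;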
Netflix Data Cleaning and Analysis Project | End to End Data Engineering Project (SQL + Python)
Views: 27K · a month ago
In this video we will implement an end to end ELT project. ELT stands for Extract, Load and Transform. We will use the Netflix dataset to clean and analyze the data using SQL and Python. LinkedIn: www.linkedin.com/in/ankitbansal6/ High quality Data Analytics affordable courses: www.namastesql.com/ End to End ETL project : ua-cam.com/video/uL0-6kfiH3g/v-deo.html Netflix dataset: www.kaggle.com/data...
Meesho SQL Interview Question for Data Analysts | SQL Non-Equi Join | Aam vs Mentos Zindagi
Views: 12K · a month ago
Meesho SQL Interview Question for Data Analysts | SQL Non-Equi Join | Aam vs Mentos Zindagi
3 Solutions to a ITC Infotech SQL Interview Question
Views: 9K · a month ago
In this video we will solve an ITC Infotech SQL interview question using 3 different solutions. Here is the script: CREATE TABLE city_distance ( distance INT, source VARCHAR(512), destination VARCHAR(512) ); delete from city_distance; INSERT INTO city_distance(distance, source, destination) VALUES ('100', 'New Delhi', 'Panipat'); INSERT INTO city_distance(distance, source, destination) VALUES ('200', 'Amb...
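The exact ask is not spelled out in this snippet; a common reading of this city_distance setup is "collapse reverse-duplicate routes". A hedged sketch under that assumption:
-- Assumption: keep a single row per unordered (source, destination) pair.
-- Note: if the two directions store different distances, both rows survive DISTINCT.
SELECT DISTINCT
       CASE WHEN source < destination THEN source ELSE destination END AS city_a,
       CASE WHEN source < destination THEN destination ELSE source END AS city_b,
       distance
FROM city_distance;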
Angel One Easy-Peasy SQL Interview Question for a Data Science Position
Views: 11K · a month ago
In this video we will discuss an Angel One SQL interview question asked for a data science position. Kickoff your data analytics journey: www.namastesql.com/ Script: CREATE TABLE tickets ( airline_number VARCHAR(10), origin VARCHAR(3), destination VARCHAR(3), oneway_round CHAR(1), ticket_count INT ); INSERT INTO tickets (airline_number, origin, destination, oneway_round, ticket_count) VALUES ('D...
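The actual question is truncated above; one plausible summary on this tickets table, assuming 'O' and 'R' are the oneway_round codes (an assumption, since the insert values are cut off), would be:
-- Assumed codes: 'O' = one-way, 'R' = round trip.
SELECT airline_number,
       SUM(CASE WHEN oneway_round = 'O' THEN ticket_count ELSE 0 END) AS oneway_tickets,
       SUM(CASE WHEN oneway_round = 'R' THEN ticket_count ELSE 0 END) AS round_tickets
FROM tickets
GROUP BY airline_number;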
Swiggy Data Analyst SQL Interview Question and Answer
Views: 13K · a month ago
In this video we will discuss a SQL interview question asked at Swiggy for a Data Analyst position. Here is the script: Create the table CREATE TABLE stock ( supplier_id INT, product_id INT, stock_quantity INT, record_date DATE ); Insert the data delete from stock; INSERT INTO stock (supplier_id, product_id, stock_quantity, record_date) VALUES (1, 1, 60, '2022-01-01'), (1, 1, 40, '2022-01-02'),...
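The description cuts off before the actual question; a hedged sketch of a typical ask on this stock table (latest recorded stock per supplier and product) would be:
-- Assumption: return the most recent stock_quantity per (supplier_id, product_id).
SELECT supplier_id, product_id, stock_quantity, record_date
FROM (
    SELECT s.*,
           ROW_NUMBER() OVER (PARTITION BY supplier_id, product_id
                              ORDER BY record_date DESC) AS rn
    FROM stock s
) t
WHERE rn = 1;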
IBM Data Engineer SQL Interview Question (Hacker Rank Online Test)
Views: 17K · 2 months ago
In this video we are going to discuss a SQL interview question asked at IBM for a data engineer position. It was part of a HackerRank test. We are also going to tweak the question a bit and try to solve it. Kick off your data analytics journey: www.namastesql.com/ script: CREATE TABLE FAMILIES ( ID VARCHAR(50), NAME VARCHAR(50), FAMILY_SIZE INT ); Insert data into FAMILIES table INSERT INTO FA...
Honeywell SQL Interview Question | Print Movie Stars (⭐ ⭐ ⭐ ⭐⭐) For best movie in each Genre
Views: 11K · 2 months ago
In this video we will discuss the solution to a SQL interview problem asked in a Honeywell interview. Here is the script: CREATE TABLE movies ( id INT PRIMARY KEY, genre VARCHAR(50), title VARCHAR(100) ); Create reviews table CREATE TABLE reviews ( movie_id INT, rating DECIMAL(3,1), FOREIGN KEY (movie_id) REFERENCES movies(id) ); Insert sample data into movies table INSERT INTO movies (id, genre, tit...
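A sketch matching the title (print a star string for the top-rated movie in each genre), assuming SQL Server's REPLICATE; MySQL would use REPEAT instead:
-- Best movie per genre by average rating, rendered as stars.
WITH avg_ratings AS (
    SELECT m.id, m.genre, m.title, AVG(r.rating) AS avg_rating
    FROM movies m
    JOIN reviews r ON r.movie_id = m.id
    GROUP BY m.id, m.genre, m.title
)
SELECT genre, title,
       REPLICATE('*', CAST(ROUND(avg_rating, 0) AS INT)) AS stars
FROM (
    SELECT a.*,
           RANK() OVER (PARTITION BY genre ORDER BY avg_rating DESC) AS rnk
    FROM avg_ratings a
) x
WHERE rnk = 1;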
Cracked Myntra as Data Analyst with 1 Year Experience
Views: 15K · 2 months ago
This podcast is with Saurabh, who recently cracked the data analyst interview at Myntra. We discuss the following points: 1- How to get calls from product based companies. 2- Resources he used for preparation. 3- Interview questions 4- Resume preparation 5- What else matters apart from the resume Zero to hero(Advance) SQL Aggregation: ua-cam.com/video/5Ighj_2PGV0/v-deo.html Most Asked Join Based Interview ...
End to End Data Analytics Project (Python + SQL)
Views: 97K · 2 months ago
In this video we will do an end to end data analytics project using Python and SQL. We will use the Kaggle API to download the dataset, do data processing and cleaning using pandas, and load the data into SQL Server. Lastly we will answer some interesting questions using SQL. github link: github.com/ankitbansal6/data_analytics_project/ Data Analytics high quality content: www.namastesql.com/ Zero...
SCD Type 1 and Type 2 using SQL | Implementation of Slowly Changing Dimensions
Views: 10K · 2 months ago
SCD Type 1 and Type 2 using SQL | Implementation of Slowly Changing Dimensions
SQL Merge Statement Tutorial A-Z | How to use Merge in SQL step by step
Views: 10K · 2 months ago
SQL Merge Statement Tutorial A-Z | How to use Merge in SQL step by step
SQL QUALIFY Keyword | Reduce Your Sub Queries and CTEs
Views: 8K · 2 months ago
SQL QUALIFY Keyword | Reduce Your Sub Queries and CTEs
Top 5 Advanced SQL Interview Questions and Answers | Frequently Asked SQL interview questions
Views: 36K · 3 months ago
Top 5 Advanced SQL Interview Questions and Answers | Frequently Asked SQL interview questions
Tricky SQL Interview Question by a Product Based Company | Ludo King SQL Analytics
Views: 7K · 3 months ago
Tricky SQL Interview Question by a Product Based Company | Ludo King SQL Analytics
Famous SQL Interview Question | First Name , Middle Name and Last Name of a Customer
Views: 23K · 3 months ago
Famous SQL Interview Question | First Name , Middle Name and Last Name of a Customer
Rollup, Cube and Grouping Sets in SQL | Advanced SQL Tutorial | Beyond Group By
Views: 15K · 3 months ago
Rollup, Cube and Grouping Sets in SQL | Advanced SQL Tutorial | Beyond Group By
SQL For Business Intelligence | YoY, QoQ, MTD, YTD, QTD etc. in a Single SQL
Views: 15K · 3 months ago
SQL For Business Intelligence | YoY, QoQ, MTD, YTD, QTD etc. in a Single SQL
Building Calendar Dimension Table from Scratch with SQL | SQL For Analytics
Views: 15K · 4 months ago
Building Calendar Dimension Table from Scratch with SQL | SQL For Analytics
Launching Live Bootcamp of SQL for Data Analytics 0 to Hero | Starting March 2nd 2024
Views: 9K · 4 months ago
Launching Live Bootcamp of SQL for Data Analytics 0 to Hero | Starting March 2nd 2024
Why I left Amazon.
Views: 12K · 4 months ago
Why I left Amazon.
Superstore Data Analysis | End to End AWS Data Engineering Project for Beginners
Views: 23K · 4 months ago
Superstore Data Analysis | End to End AWS Data Engineering Project for Beginners
Freshworks Data Analyst SQL Interview Problem | SQL For Data Analytics
Views: 22K · 4 months ago
Freshworks Data Analyst SQL Interview Problem | SQL For Data Analytics
Solving a Leetcode DSA Problem with SQL | Advanced SQL for Analytics | Aam vs Mentos Life
Views: 7K · 4 months ago
Solving a Leetcode DSA Problem with SQL | Advanced SQL for Analytics | Aam vs Mentos Life
Amazon Python Interview Question for a Data Engineer L4 Position | Python For Data Analytics
Views: 15K · 5 months ago
Amazon Python Interview Question for a Data Engineer L4 Position | Python For Data Analytics

COMMENTS

  • @sidindian1982
    @sidindian1982 20 minutes ago

    Brilliantly Explained sir .... Mindblowing ... 🙂

  • @user-ex6vq8ft8l
    @user-ex6vq8ft8l 2 hours ago

    I am having trouble connecting to the MySQL server with the given code using SQLAlchemy. Is there another way to connect to the MySQL server?

  • @shawnsmothers1715
    @shawnsmothers1715 3 hours ago

    I think this is great, but I do think it's more advanced than most will ask. I can't see many companies asking these questions to juniors: one, because they are pretty complex, and two, because of time.

  • @NishadDhamne
    @NishadDhamne 5 hours ago

    select a.team_name,
           (COALESCE(b.wins, 0) + COALESCE(d.lost, 0)) as No_Of_matches,
           COALESCE(b.wins, 0) as no_of_WINS,
           COALESCE(d.Lost, 0) as No_of_Losses
    from (select Team_1 as team_name from nishaddb.dbo.icc_world_cup
          union
          select Team_2 from nishaddb.dbo.icc_world_cup) a
    left join (select winner, count(winner) as wins
               from nishaddb.dbo.icc_world_cup
               group by winner) b
           on a.team_name = b.winner
    left join (select c.team, sum(c.lost1) as Lost
               from (select Team_1 as team, count(team_1) as lost1
                     from nishaddb.dbo.icc_world_cup
                     where team_1 <> winner group by team_1
                     union all
                     select Team_2, count(team_2) as lost1
                     from nishaddb.dbo.icc_world_cup
                     where team_2 <> winner group by team_2) c
               group by c.Team) d
           on a.team_name = d.team

  • @PowerBiLearning-xh8wk
    @PowerBiLearning-xh8wk 6 hours ago

    I was getting this error with the DATETRUNC function: 'DATETRUNC' is not a recognized built-in function name.
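A note on this error: DATETRUNC exists only from SQL Server 2022 onwards, so it is expected on older versions. A hedged equivalent for month-level truncation on older SQL Server (using the LOGIN_TIMESTAMP column from the script at the top of this page as an example) is:
-- Month truncation without DATETRUNC, valid on older SQL Server versions.
SELECT DATEADD(MONTH, DATEDIFF(MONTH, 0, LOGIN_TIMESTAMP), 0) AS login_month
FROM logins;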

  • @mr.pingpong502
    @mr.pingpong502 6 hours ago

    with cte as (
        select *, 'Source' as Location from source
        where id not in (select a.id from source a
                         inner join target b on a.id = b.id and a.name = b.name)
        union all
        select *, 'target' as Location from target
        where id not in (select a.id from source a
                         inner join target b on a.id = b.id and a.name = b.name)
    )
    select distinct a.id,
           case when occurance > 1 then 'Mismatch'
                when lower(Location) = 'source' then 'New in Source'
                when lower(Location) = 'target' then 'New in Target'
                else null end as comments
    from cte a
    inner join (select id, count(id) as occurance from cte group by id) b
            on a.id = b.id

  • @Chathur732
    @Chathur732 7 hours ago

    with cte as (
        SELECT *,
               LEAD(date_sub(bill_date, interval 1 day), 1, '9999-12-31')
                   OVER (PARTITION BY emp_name ORDER BY bill_date) AS bill_date_end
        FROM billings
    )
    select cte.emp_name, sum(bill_rate * bill_hrs) as Total_amount
    from cte
    join hoursworked
      on cte.emp_name = hoursworked.emp_name
     and hoursworked.work_date between cte.bill_date and bill_date_end
    group by cte.emp_name

  • @amrutapatil6491
    @amrutapatil6491 7 hours ago

    SELECT A.*,
           DATEDIFF(DAY, A.CREATE_DATE, A.RESOLVED_DATE) AS ACTUAL_DATE,
           DATEDIFF(DAY, A.CREATE_DATE, A.RESOLVED_DATE)
             - 2 * DATEDIFF(WEEK, A.CREATE_DATE, A.RESOLVED_DATE)
             - A.NO_OF_HOLIDAYS AS ACTUAL_WORKING_DAYS
    FROM (
        SELECT TICKET_ID, CREATE_DATE, RESOLVED_DATE,
               COUNT(HOLIDAY_DATE) AS NO_OF_HOLIDAYS, REASON
        FROM TICKETS T
        LEFT JOIN HOLIDAYS H
               ON H.HOLIDAY_DATE BETWEEN T.CREATE_DATE AND T.RESOLVED_DATE
              AND DATENAME(WEEKDAY, H.HOLIDAY_DATE) NOT IN ('Saturday', 'Sunday')
        GROUP BY TICKET_ID, CREATE_DATE, RESOLVED_DATE, REASON
    ) A;

  • @praveensundarsundar6007
    @praveensundarsundar6007 8 hours ago

    Can someone confirm whether this logic works? First we find the count of new customers every day, and then find the active customer count similarly. By subtracting them we'll get the repeat or continuing customers.

  • @Chathur732
    @Chathur732 8 hours ago

    select username, activity, startdate, enddate
    from (
        select *,
               count(1) over (partition by username) as count_username,
               rank() over (partition by username order by startdate, enddate) as ranking
        from useractivity
    ) a
    where (ranking = 2) or (count_username < 2)

  • @AbhishekSharma-vm7tr
    @AbhishekSharma-vm7tr 9 hours ago

    1:03 sir, you said "rolling threesome" and that was very funny. By the way, thank you for the video.

  • @Chathur732
    @Chathur732 10 hours ago

    Part 1: Customer Retention
    with cte as (
        SELECT *,
               month(order_date) as month,
               lag(order_date) over (partition by cust_id order by order_date) as previous_order_date
        from transactions
    )
    SELECT month(order_date) as month_1,
           sum(case when previous_order_date is not null then 1 else 0 end) as count
    from cte
    group by month_1
    Part 2: Customer Churn
    with cte as (
        SELECT *,
               month(order_date) as month,
               lead(order_date) over (partition by cust_id order by order_date) as next_order_date
        from transactions
    )
    SELECT month(order_date) as month_1,
           sum(case when next_order_date is null then 1 else 0 end) as count
    from cte
    group by month_1
    happy SQL coding!!

  • @susmitakundu6120
    @susmitakundu6120 10 hours ago

    Thanks Ankit for your guidance. Please have a look at the query below:
    select sum(case when tc > 1 then 1 else 0 end) as repeat_customer,
           sum(case when tc = 1 then 1 else 0 end) as new_customer
    from (select customer_id, count(customer_id) as tc
          from customer_orders
          group by customer_id) a;

  • @RTX7050
    @RTX7050 11 hours ago

    If you're facing issues with creating the engine, you can use this approach, which is easy to modify:
    import sqlalchemy as sa
    import pyodbc
    import urllib.parse
    params = urllib.parse.quote_plus(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=DEMO\SQL2022;"
        "DATABASE=Python;"
        # "UID=domain\username;"
        # "PWD=*******"
        "Trusted_Connection=yes"
    )
    engine = sa.create_engine(f"mssql+pyodbc:///?odbc_connect={params}")
    conn = engine.connect()

  • @gameply347
    @gameply347 12 hours ago

    Amazing video got to learn so many things <3.

  • @abdulashwaq7312
    @abdulashwaq7312 14 hours ago

    I have a query based on city: I need to check pizzas sent across cities like Bangalore, Mumbai, Delhi etc. Please help me with how to write this code in SQL, on a table called pizza.

    • @ankitbansal6
      @ankitbansal6 8 hours ago

      Share data and expected output

  • @Batmanspidermanironman
    @Batmanspidermanironman 15 hours ago

    Hi Ankit, Very well explained! Can we use dense_rank here instead of rank()?

  • @shivammishra-mk9jp
    @shivammishra-mk9jp 15 hours ago

    I saw multiple videos on this topic, but trust me guys, no one can simplify the language or the concept like Mr. Ankit Bansal. Respect you sir, for a reason 🙏🙏 thanks a lot ❤

  • @vikhyatjalota2213
    @vikhyatjalota2213 18 hours ago

    with cte_1 as (
        select *,
               date_sub(date_value, interval (rnk) day) as orignal_date
        from (select *,
                     row_number() over (partition by state order by date_value) as rnk
              from tasks
              order by 1) new
    )
    select min(date_value) as start_date,
           max(date_value) as end_date,
           state
    from cte_1
    group by orignal_date, state

  • @vikhyatjalota2213
    @vikhyatjalota2213 19 hours ago

    MySQL solution:
    with second as (
        select seller_id, order_date, item_id, item_brand
        from (select seller_id, order_date, item_id, item_brand,
                     dense_rank() over (partition by seller_id order by order_date asc) as rnk
              from orders
              join items using (item_id)
              order by 1) new
        where rnk = 2
    )
    select user_id as seller_id,
           case when item_brand = favorite_brand then 'yes' else 'no' end as 2nd_item_fav_brand
    from users u
    left join second s on u.user_id = s.seller_id

  • @ss-hm6ey
    @ss-hm6ey 19 hours ago

    Please correct me if I am wrong... My query for finding users who bought only iPhone 15:
    with cte as (
        select user_id from iphone
        group by user_id
        having count(*) < 2)
    select user_id, model from iphone
    where user_id in (select user_id from cte) and model = "i-15"

  • @himanshubasra4451
    @himanshubasra4451 a day ago

    Here is one more approach:
    select e1.emp_id, e1.name, e1.salary, e2.salary, e1.dept_id
    from emp_salary as e1
    join emp_salary e2 on e1.dept_id = e2.dept_id
    where e1.salary = e2.salary and e1.emp_id != e2.emp_id
    I think it is easier too. Do let me know. Thanks

  • @UjjwalSinghPal-um3pb
    @UjjwalSinghPal-um3pb a day ago

    If I start learning now, there are very few data engineering internships (if one has a data internship it is easy to get a job, and that person also gets hands-on experience). So should I go for data analyst first, just for the internship, and after that study data engineering for a job? That also seems like a good path, and I need to do an internship because it is part of our college criteria; they do not support us much if we have not done an internship. Informative video, by the way.

  • @shashwatdev2371
    @shashwatdev2371 a day ago

    My solution for this question:
    with cte1 as (
        select team_1 as team, count(*) as num1 from icc_world_cup group by team_1
        union all
        select team_2 as team, count(*) as num1 from icc_world_cup group by team_2
    ),
    cte2 as (
        select team, sum(num1) as Matches_Played from cte1 group by team
    ),
    wins as (
        select winner as team, count(*) as no_of_wins from icc_world_cup group by winner
    )
    select a.team,
           a.Matches_Played,
           coalesce(b.no_of_wins, 0) as no_of_wins,
           a.Matches_Played - coalesce(b.no_of_wins, 0) as no_of_loss
    from cte2 as a
    left join wins as b on a.team = b.team
    order by no_of_wins desc;

  • @siddharthchoudhary103
    @siddharthchoudhary103 a day ago

    For the 5th question, can it also be done using a left anti join, right? Like where dep.deptid is null?

  • @user-Veeravalli
    @user-Veeravalli a day ago

    with cte as (
        select a.*, b.first_visit,
               case when ord_dt = first_visit then 1 else 0 end as flag
        from cust_orders a
        left join (select cust_id, min(ord_dt) as first_visit
                   from cust_orders
                   group by cust_id) b
               on a.cust_id = b.cust_id
    )
    select ord_dt,
           sum(flag) as new_cust_cnt,
           count(flag) - sum(flag) as reptd_cust_cnt
    from cte
    group by ord_dt;

  • @ritudahiya6223
    @ritudahiya6223 a day ago

    @Ankit Bansal Can you please write the answer for this query? It will take 2 mins.
    A         B
    1         2
    2         2
    1         3
    (blank)   4
    NULL      NULL
    There are 2 tables; table A has one row with a blank value (row 4). Please give the output for an inner join and a left join (see the sketch after this thread). Eagerly waiting for your reply.

    • @ritudahiya6223
      @ritudahiya6223 13 hours ago

      @ankitbansal please reply on this..I have an interview tomorrow...

    • @ritudahiya6223
      @ritudahiya6223 12 hours ago

      @ankitbansal I just want to know what will be the output when there is blank row in one table
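A hedged sketch of the behaviour being asked about in this thread, assuming the values are reconstructed into two single-column tables (hypothetical names t_a and t_b) with the blank stored as NULL. NULLs never satisfy an equality join predicate, so they drop out of an INNER JOIN and only reappear NULL-padded from the preserved side of a LEFT JOIN:
-- Assumed reconstruction of the question's data; "(blank)" is modelled as NULL.
CREATE TABLE t_a (val INT);
CREATE TABLE t_b (val INT);
INSERT INTO t_a VALUES (1), (2), (1), (NULL), (NULL);
INSERT INTO t_b VALUES (2), (2), (3), (4), (NULL);
-- INNER JOIN: only non-NULL matching values survive (the single 2 in t_a matches two 2s in t_b, giving 2 rows).
SELECT a.val AS a_val, b.val AS b_val
FROM t_a a INNER JOIN t_b b ON a.val = b.val;
-- LEFT JOIN: every t_a row is kept; the unmatched 1, 1, NULL, NULL rows get NULL for b_val (6 rows total).
SELECT a.val AS a_val, b.val AS b_val
FROM t_a a LEFT JOIN t_b b ON a.val = b.val;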

  • @himanshijoshi1001
    @himanshijoshi1001 a day ago

    with first_order as (
        select customer_id,
               min(order_date) as first_order_date,
               sum(order_amount) as sales
        from customer_orders
        group by customer_id
    )
    select co.order_date,
           sum(case when co.order_date = fo.first_order_date then 1 else 0 end) as new_customer_cnt,
           sum(case when co.order_date != fo.first_order_date then 1 else 0 end) as repeat_customer_cnt,
           sum(case when co.order_date = fo.first_order_date then sales end) as new_cust_rev,
           sum(case when co.order_date != fo.first_order_date then sales end) as repeat_cust_rev
    from customer_orders co
    inner join first_order fo on co.customer_id = fo.customer_id
    group by co.order_date;

  • @vinaykumarreddykonde4065
    @vinaykumarreddykonde4065 a day ago

    Thanks for the videos. Here is my approach:
    select c.*,
           count(holiday_date) as hd,
           case when wek > 0 then days - wek * 2 - count(holiday_date) else days end as actua_days
    from (select ticket_id, create_date, resolved_date,
                 DATEDIFF(day, create_date, resolved_date) as days,
                 DATEDIFF(WEEK, create_date, resolved_date) as wek
          from tickets) c
    left join holidays h
           on holiday_date between create_date and resolved_date
    group by ticket_id, days, wek, create_date, resolved_date, holiday_date
    having datename(dw, holiday_date) not in ('Sunday', 'Saturday')
        or count(holiday_date) = 0;

  • @user-Veeravalli
    @user-Veeravalli a day ago

    select team1 as Team_Name,
           k as Matches_Played,
           coalesce(w, 0) as No_of_Wins,
           k - coalesce(w, 0) as No_of_Loses
    from (select team1, count(team1) as k
          from (select team1, team2 from cric_table
                union all
                select team2, team1 from cric_table) a
          group by team1) q
    left join (select winner, count(winner) as w
               from cric_table
               group by winner) r
           on q.team1 = r.winner

  • @aryanbhagat542
    @aryanbhagat542 a day ago

    create table purchases (
        user_id int,
        product_id int,
        quantity int,
        purchase_date datetime
    );
    insert into purchases values (536, 3223, 6, '2022-01-11 12:33:44');
    insert into purchases values (827, 3585, 35, '2022-01-20 14:05:26');
    insert into purchases values (536, 3223, 5, '2022-03-02 09:33:28');
    insert into purchases values (536, 1435, 10, '2022-03-02 08:40:00');
    insert into purchases values (827, 2452, 45, '2022-04-09 00:00:00');
    select * from purchases;

  • @vijiinfo
    @vijiinfo a day ago

    ❤❤❤

  • @manurajbhatt1085
    @manurajbhatt1085 a day ago

    declare @x int; set @x = 4;
    declare @dat char(10); set @dat = '2024-07-06';
    select DateAdd(DAY, (7 - day(@dat)) + (@x - 1) * 7, @dat);
    What do you think?

  • @ankush_in_sync5998
    @ankush_in_sync5998 a day ago

    Love you sir, love you

  • @mansinayak3360
    @mansinayak3360 a day ago

    Hi Ankit, I can't see the Japanese characters in the title after changing the dtype to nvarchar; it's showing question marks. I've been searching for what could be the reason. Need your suggestion to resolve this.

  • @9844468928
    @9844468928 a day ago

    Thank you so much Ankit, I was really struggling to understand the concept of CTEs and you have cleared it up.
    with First_visit as (
        select customer_id, min(order_date) as first_visit_date
        from customer_orders
        group by customer_id
    ),
    visit_flag as (
        select co.*, fv.first_visit_date,
               case when co.order_date = fv.first_visit_date then 1 else 0 end as First_visit_flag,
               case when co.order_date != fv.first_visit_date then 1 else 0 end as Repeated_visit_flag,
               case when co.order_date = fv.first_visit_date then order_amount else 0 end as New_customer_order_amount,
               case when co.order_date != fv.first_visit_date then order_amount else 0 end as Repeated_customer_order_amount
        from customer_orders co
        inner join First_visit FV on co.customer_id = FV.customer_id --- this join gives each customer's first visit date
    )
    select order_date,
           sum(First_visit_flag) as total_new_customer,
           sum(Repeated_visit_flag) as total_repeated_customer,
           sum(New_customer_order_amount) as New_customer_order_amount,
           sum(Repeated_customer_order_amount) as Repeated_customer_order_amount
    from visit_flag
    group by order_date;

  • @shivakumar-ic9dh
    @shivakumar-ic9dh a day ago

    Simple and good explanation😊

  • @naveenreddynarala1700
    @naveenreddynarala1700 a day ago

    Thanks for sharing Ankit! Here is my attempt:
    CREATE TEMPORARY TABLE cricket.Total_Played AS
    SELECT team, IFNULL(SUM(occur), 0) AS Total_Matches_Played
    FROM (
        SELECT Team_A AS team, COUNT(*) AS occur FROM cricket.matches GROUP BY Team_A
        UNION ALL
        SELECT Team_B AS team, COUNT(*) AS occur FROM cricket.matches GROUP BY Team_B
    ) AS a
    GROUP BY team;
    create temporary table cricket.Total_Matches_Won as
    select Match_Won, count(*) as total_won
    from cricket.matches
    group by 1;
    select team,
           Total_Matches_Played,
           ifnull((Total_Matches_Played - total_won), Total_Matches_Played) as Lost_Matches,
           ifnull(total_won, 0) as Won_Matches
    from cricket.Total_Played a
    left join cricket.Total_Matches_Won b on a.team = b.Match_Won