✨The STAR method is a powerful technique used to answer behavioral interview questions effectively.
It helps structure responses by focusing on Situation, Task, Action, and Result. For analytics professionals, using the STAR method ensures that you demonstrate your problem-solving abilities, technical skills, and business acumen in a clear and concise way.
Here’s how the STAR method works, tailored for an analytics interview:
📍 1. Situation
Describe the context or challenge you faced. For analysts, this might be related to data challenges, business processes, or system inefficiencies. Be specific about the setting, whether it was a project, a recurring task, or a special initiative.
Example: “In my previous role as a data analyst at XYZ Company, we were experiencing a high churn rate among our subscription customers. This was a critical issue because it directly impacted revenue.”
📍 2. Task
Explain the responsibilities you had or the goals you needed to achieve in that situation. In analytics, this usually revolves around diagnosing the problem, designing experiments, or conducting data analysis.
Example: “I was tasked with identifying the factors contributing to customer churn and providing actionable insights to the marketing team to help them improve retention.”
📍 3. Action
Detail the specific actions you took to address the problem. Be sure to mention any tools, software, or methodologies you used (e.g., SQL, Python, data visualization tools, statistical models). This is your opportunity to showcase your technical expertise and approach to problem-solving.
Example: “I collected and analyzed customer data using SQL to extract key trends. I then used Python for data cleaning and statistical analysis, focusing on engagement metrics, product usage patterns, and customer feedback. I also collaborated with the marketing and product teams to understand business priorities.”
📍 4. Result
Highlight the outcome of your actions, especially any measurable impact. Quantify your results if possible, as this demonstrates your effectiveness as an analyst. Show how your analysis directly influenced business decisions or outcomes.
Example: “As a result of my analysis, we discovered that customers were disengaging due to a lack of certain product features. My insights led to a targeted marketing campaign and product improvements, reducing churn by 15% over the next quarter.”
Example STAR Answer for an Analytics Interview Question:
Question: *"Tell me about a time you used data to solve a business problem."*
Answer (STAR format):
🔻*S*: “At my previous company, our sales team was struggling with inconsistent performance, and management wasn’t sure which factors were driving the variance.”
🔻*T*: “I was assigned the task of conducting a detailed analysis to identify key drivers of sales performance and propose data-driven recommendations.”
🔻*A*: “I began by collecting sales data from the past year and segmenting it by region, product line, and sales representative. I then used Python for statistical analysis and developed a regression model to determine the key factors influencing sales outcomes. I also visualized the data using Tableau to present the findings to non-technical stakeholders.”
🔻*R*: “The analysis revealed that product mix and regional seasonality were significant contributors to the variability. Based on my findings, the company adjusted their sales strategy, leading to a 20% increase in sales efficiency in the next quarter.”
Hope this helps you 😊
Infosys is hiring 20,000 freshers across various fields, and here is a complete guide to cracking the interview.
1. Understand the Interview Structure
Infosys fresher recruitment usually has three main stages:
• Aptitude Test (Written Exam)
• Technical Interview
• HR Interview
2. Aptitude Test Preparation
The first stage typically includes questions on logical reasoning, quantitative aptitude, and verbal ability. Prepare the following:
• Quantitative Aptitude: Topics include time & work, percentages, profit & loss, probability, permutations & combinations, and number series.
• Logical Reasoning: Focus on puzzles, blood relations, data interpretation, and syllogisms.
• Verbal Ability: This includes reading comprehension, sentence correction, error spotting, synonyms/antonyms, and fill-in-the-blanks.
Resources:
• Books: RS Aggarwal’s Quantitative Aptitude for quantitative topics.
• Websites: Platforms like IndiaBix or Testbook provide practice questions.
Tips:
• Practice regularly under timed conditions.
• Use mock tests to improve speed and accuracy.
• Focus on weak areas after taking a few practice tests.
3. Technical Interview Preparation
In this round, Infosys assesses your understanding of basic programming, algorithms, data structures, and other core subjects. Here’s how to prepare:
• Programming Languages: Have a solid foundation in at least one programming language (C, C++, Java, Python).
• Data Structures & Algorithms: Study key topics like arrays, linked lists, stacks, queues, trees, and sorting algorithms.
• DBMS, Operating Systems & Networks: Be prepared for basic questions on SQL, normalization, joins, process management, and networking protocols.
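To gauge the level of SQL usually expected, a minimal hedged sketch (hypothetical Departments and Employees tables) combining a join with an aggregate:
-- Count employees per department, keeping departments with no employees
SELECT d.DeptName, COUNT(e.EmpID) AS EmployeeCount
FROM Departments d
LEFT JOIN Employees e ON e.DeptID = d.DeptID
GROUP BY d.DeptName;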
Sample Questions:
• How would you reverse a string in your preferred language?
• Explain the difference between a stack and a queue.
• What is a deadlock, and how can it be avoided?
Resources:
• GeeksforGeeks and LeetCode for coding practice and theory.
• Books like Cracking the Coding Interview by Gayle Laakmann McDowell.
Tips:
• Focus on problem-solving skills and code optimization.
• Be ready to explain your approach in technical questions.
4. Coding Round (If applicable)
Some Infosys roles might require you to go through a coding round. Practice coding problems related to arrays, strings, recursion, dynamic programming, and greedy algorithms.
Tools:
• HackerRank, CodeChef, and Codeforces are good platforms to practice coding challenges.
• Focus on coding efficiency and edge case handling.
5. HR Interview Preparation
In the HR round, you will be evaluated on your personality, communication skills, and cultural fit. Common questions include:
• Tell me about yourself.
• Why do you want to join Infosys?
• What are your strengths and weaknesses?
Tips:
• Prepare a structured self-introduction.
• Research Infosys’ values, projects, and recent developments to show enthusiasm for the company.
• Be honest but strategic with your answers regarding strengths and weaknesses.
6. Mock Interviews and Soft Skills
• Mock Interviews: Participate in mock interviews to simulate the real environment.
• Soft Skills: Work on clear communication and positive body language. Infosys looks for candidates who can explain technical concepts clearly.
7. Common Mistakes to Avoid
• Lack of Practice: Not practicing enough aptitude or coding questions can lead to poor performance in tests.
• Unclear Communication: Even if you know the solution, being unable to explain it well in technical interviews can hurt your chances.
• Overlooking HR Round: Many candidates prepare for technical rounds and ignore HR. Remember, HR rounds can be just as important.
8. Key Resources
• Aptitude: RS Aggarwal for Quantitative Aptitude.
• Coding: HackerRank, LeetCode.
• Technical Knowledge: GeeksforGeeks for theory and coding questions.
• Mock Tests: Websites like IndiaBix provide Infosys-specific mock tests and previous year papers.
Hope this helps you 😊
Important SQL Interview Questions: Window Functions
1. What is a window function in SQL? How is it different from aggregate functions?
2. Explain the use of the OVER() clause in window functions.
3. What is the purpose of the PARTITION BY clause in window functions?
4. What is the role of the ORDER BY clause in a window function?
5. What is the difference between ROW_NUMBER(), RANK(), and DENSE_RANK() window functions?
6. How do window functions differ from group functions like GROUP BY?
7. Can you use window functions with an ORDER BY clause outside of the OVER() clause? Why or why not?
8. Write a query using the ROW_NUMBER() function to assign sequential numbers to rows in a result set.
9. How does the NTILE() function work in SQL? What is its use case?
10. What is the difference between LAG() and LEAD() window functions?
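For questions 2, 4, 5, 8, and 10, a minimal hedged sketch on a hypothetical Employees table (EmployeeID, DepartmentID, Salary) showing how the OVER() clause, PARTITION BY, ORDER BY, the ranking functions, and LAG() fit together:
SELECT EmployeeID,
       DepartmentID,
       Salary,
       ROW_NUMBER() OVER (PARTITION BY DepartmentID ORDER BY Salary DESC) AS RowNum,      -- 1, 2, 3, ... with no ties
       RANK()       OVER (PARTITION BY DepartmentID ORDER BY Salary DESC) AS SalaryRank,  -- ties share a rank; numbers are skipped after ties
       DENSE_RANK() OVER (PARTITION BY DepartmentID ORDER BY Salary DESC) AS DenseRank,   -- ties share a rank; no numbers are skipped
       LAG(Salary)  OVER (PARTITION BY DepartmentID ORDER BY Salary DESC) AS PrevSalary   -- value from the previous row in the window
FROM Employees;
Unlike GROUP BY, the window functions keep every row and simply add the computed columns alongside it.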
Hope this helps you 😊
Myntra interview questions for Data Analyst 2024.
1. You have a dataset with missing values. How would you use a combination of Pandas and NumPy to fill missing values based on the mean of the column?
2. How would you create a new column in a Pandas DataFrame by normalizing an existing numeric column using NumPy’s np.min() and np.max()?
3. Explain how to group a Pandas DataFrame by one column and apply a NumPy function, like np.std() (standard deviation), to each group.
4. How can you convert a time-series column in a Pandas DataFrame to NumPy’s datetime format for faster time-based calculations?
5. How would you identify and remove outliers from a Pandas DataFrame using NumPy’s Z-score method (scipy.stats.zscore)?
6. How would you use NumPy’s percentile() function to calculate specific quantiles for a numeric column in a Pandas DataFrame?
7. How would you use NumPy's polyfit() function to perform linear regression on a dataset stored in a Pandas DataFrame?
8. How can you use a combination of Pandas and NumPy to transform categorical data into dummy variables (one-hot encoding)?
9. How would you use both Pandas and NumPy to split a dataset into training and testing sets based on a random seed?
10. How can you apply NumPy's vectorize() function on a Pandas Series for better performance?
11. How would you optimize a Pandas DataFrame containing millions of rows by converting columns to NumPy arrays? Explain the benefits in terms of memory and speed.
12. How can you perform complex mathematical operations, such as matrix multiplication, using NumPy on a subset of a Pandas DataFrame?
13. Explain how you can use np.select() to perform conditional column operations in a Pandas DataFrame.
14. How can you handle time series data in Pandas and use NumPy to perform statistical analysis like rolling variance or covariance?
15. How can you integrate NumPy's random module (np.random) to generate random numbers and add them as a new column in a Pandas DataFrame?
16. Explain how you would use Pandas' applymap() function combined with NumPy’s vectorized operations to transform all elements in a DataFrame.
17. How can you apply mathematical transformations (e.g., square root, logarithm) from NumPy to specific columns in a Pandas DataFrame?
18. How would you efficiently perform element-wise operations between a Pandas DataFrame and a NumPy array of different dimensions?
19. How can you use NumPy functions like np.linalg.inv() or np.linalg.det() for linear algebra operations on numeric columns of a Pandas DataFrame?
20. Explain how you would compute the covariance matrix between multiple numeric columns of a DataFrame using NumPy.
21. What are the key differences between a Pandas DataFrame and a NumPy array? When would you use one over the other?
22. How can you convert a NumPy array into a Pandas DataFrame, and vice versa? Provide an example.
You can find the answers here
Hope this helps you 😊
SQL Interview Questions which can be asked in a Data Analyst Interview.
1️⃣ What is the difference between a Primary key and a Unique key?
◼Primary key: a column or set of columns that uniquely identifies each record in a table. It cannot contain NULL values, and only one primary key can exist in a table.
◼Unique key: similar to a primary key, it also uniquely identifies each record in a table, but it can contain NULL values and multiple unique keys can exist in a table.
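A minimal sketch of the difference, on a hypothetical Employees table (real table and column names will differ):
CREATE TABLE Employees (
    EmployeeID INT PRIMARY KEY,        -- exactly one primary key; NULLs not allowed
    Email      VARCHAR(255) UNIQUE,    -- a unique key; NULLs are allowed
    PhoneNo    VARCHAR(20)  UNIQUE     -- a second unique key on the same table
);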
2️⃣ What is a Candidate key?
◼A column or set of columns that can uniquely identify each record in a table. The primary key is chosen from among the candidate keys, and the remaining candidate keys are the alternate keys.
3️⃣ What is a Constraint?
◼A specific rule or limit that we define on our table, e.g. NOT NULL, AUTO_INCREMENT.
4️⃣ Can you differentiate between TRUNCATE and DELETE?
◼TRUNCATE is a DDL command. It removes all data from a table while preserving the table's structure, and it does not delete the data row by row, which makes it faster than DELETE. DELETE is a DML command: it removes the rows that match a specified condition (or all rows if no condition is given), and it deletes data row by row, which makes it slower than TRUNCATE.
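A short illustration, assuming a hypothetical Sales table:
TRUNCATE TABLE Sales;                              -- DDL: empties the whole table, keeps its structure
DELETE FROM Sales WHERE SaleDate < '2023-01-01';   -- DML: removes only the rows matching the condition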
5️⃣ What is the difference between a 'View' and a 'Stored Procedure'?
◼A View is a virtual table that gets its data from the base table(s); it is basically a saved SELECT statement. A Stored Procedure is a SQL statement, or a set of SQL statements, stored on the database server and executed on demand.
6️⃣ What is the difference between a Common Table Expression and a temporary table?
◼A CTE is a temporary result set defined within the execution scope of a single SELECT, INSERT, UPDATE, or DELETE statement, while a temporary table is stored in TempDB and gets deleted once the session ends.
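A hedged sketch of both, on a hypothetical Orders table (the temporary-table syntax shown is SQL Server's; other databases use CREATE TEMPORARY TABLE):
-- CTE: visible only to the single statement that follows it
WITH HighValueCustomers AS (
    SELECT CustomerID, SUM(Amount) AS Total
    FROM Orders
    GROUP BY CustomerID
    HAVING SUM(Amount) > 10000
)
SELECT * FROM HighValueCustomers;

-- Temporary table: persists for the rest of the session
SELECT CustomerID, SUM(Amount) AS Total
INTO #HighValueCustomers
FROM Orders
GROUP BY CustomerID
HAVING SUM(Amount) > 10000;

SELECT * FROM #HighValueCustomers;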
7️⃣ How would you differentiate between a clustered index and a non-clustered index?
◼A clustered index determines the physical ordering of data in a table, and a table can have only one clustered index. A non-clustered index is analogous to the index of a book: the index is stored in one place and the data in another, with the index holding pointers to the storage location of the data. A table can have more than one non-clustered index.
8️⃣ Explain triggers.
◼Triggers are SQL code that is executed automatically in response to certain events on a table. They are used to maintain the integrity of data.
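A minimal sketch of a trigger (MySQL syntax, with hypothetical Employees and SalaryAudit tables):
CREATE TRIGGER trg_salary_audit
AFTER UPDATE ON Employees
FOR EACH ROW
-- log every salary change to an audit table
INSERT INTO SalaryAudit (EmployeeID, OldSalary, NewSalary, ChangedAt)
VALUES (OLD.EmployeeID, OLD.Salary, NEW.Salary, NOW());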
Recent Interview Question for Data Analyst Role
Question 1) You have two tables:
Employee:-
Columns: EID (Employee ID), ESalary (Employee Salary)
empdetails:-
Columns: EID (Employee ID), EDOB (Employee Date of Birth)
Your task is to:
1) Identify all employees whose salary (ESalary) is an odd number?
2) Retrieve the date of birth (EDOB) for these employees from the empdetails table.
How would you write a SQL query to achieve this?
SELECT e.EID, ed.EDOB
FROM (
SELECT EID
FROM Employee
WHERE ESalary % 2 <> 0
) e
JOIN empdetails ed ON e.EID = ed.EID;
Explanation of the query:
Filter employees with odd salaries:
The subquery SELECT EID FROM Employee WHERE ESalary % 2 <> 0 keeps only the Employee IDs (EID) whose salary (ESalary) is an odd number. The modulo operator % checks whether ESalary divided by 2 leaves a remainder.
Join with empdetails:
The main query then takes the filtered Employee IDs from the subquery and joins them with the empdetails table on the EID column, retrieving the date of birth (EDOB) for those employees.
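The same result can also be written without a subquery; a minimal alternative sketch (note that Oracle uses MOD(ESalary, 2) instead of the % operator):
SELECT e.EID, ed.EDOB
FROM Employee e
JOIN empdetails ed ON e.EID = ed.EID
WHERE e.ESalary % 2 <> 0;   -- odd salary: remainder after dividing by 2 is not zero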
Hope this helps you 😊
Data Analyst Interview Questions with Answers 👇👇
Self-Introduction (2-3 minutes)
"Hello, my name is Rahul Sharma, and I'm excited to be here today. With a degree in Computer Science, I've developed strong analytical skills and a passion for data analysis. Over the past 2-3 years, I've worked as a Data Analyst, primarily focusing on data visualization, SQL development, and business intelligence. My expertise includes SQL Server, Power BI, and data modeling."
Explain Your Last Project (5-7 minutes)
"In my previous role at ABC Corporation, I worked on a project to analyze customer purchasing behavior. The goal was to identify trends and preferences, informing marketing strategies.
"My responsibilities included:
• Data extraction from SQL Server
• Data visualization using Power BI
• Data modeling and normalization
• Stakeholder communication
"Some challenges I faced included:
• Handling large datasets
• Ensuring data quality and accuracy
• Meeting tight deadlines
"To overcome these challenges, I:
• Optimized SQL queries for faster data retrieval
• Implemented data validation checks
• Collaborated closely with stakeholders"
Challenges You Faced (3-5 minutes)
"Two significant challenges I faced were:
1. Data quality issues due to inconsistent formatting.
Resolution: I developed a data cleaning script using SQL and implemented data validation checks.
2. Performance issues with Power BI reports.
Resolution: I optimized data models, reduced data redundancy, and leveraged Power BI's built-in performance optimization features."
Your Roles and Responsibilities (3-5 minutes)
"As a Data Analyst at ABC Corporation, my primary responsibilities included:
• Data extraction and analysis
• Data visualization and reporting
• Stakeholder communication and presentation
• Data modeling and normalization
"I worked closely with cross-functional teams to ensure data-driven insights informed business decisions."
Two Issues You Got Stuck On and How You Resolved Them (5-7 minutes)
"Two issues I got stuck on were:
1. Optimizing a slow-running SQL query.
Resolution: I analyzed the query execution plan, applied indexing strategies, and rewrote the query to reduce join operations.
2. Troubleshooting Power BI visualization issues.
Resolution: I adjusted data model settings, validated data integrity, and leveraged Power BI's community forums for support."
How Did You Do Optimization (3-5 minutes)
"To optimize query performance:
• I analyzed query execution plans
• Applied indexing strategies
• Rewrote queries to reduce join operations
• Utilized data caching"
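To make the indexing point concrete, a hedged sketch with a hypothetical Orders table (EXPLAIN is MySQL/PostgreSQL syntax; SQL Server uses execution plans instead):
-- Inspect how the query is executed
EXPLAIN
SELECT CustomerID, SUM(Amount)
FROM Orders
WHERE OrderDate >= '2024-01-01'
GROUP BY CustomerID;

-- Index the filtered and grouped columns to avoid a full table scan
CREATE INDEX idx_orders_date_customer ON Orders (OrderDate, CustomerID);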
Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
5⃣ Frequently Asked SQL Interview Questions with Answers in Data Analyst Interviews
📍1. Write a SQL query to find the average purchase amount for each customer. Assume you have two tables: Customers (CustomerID, Name) and Orders (OrderID, CustomerID, Amount).
SELECT c.CustomerID, c.Name, AVG(o.Amount) AS AveragePurchase
FROM Customers c
JOIN Orders o ON c.CustomerID = o.CustomerID
GROUP BY c.CustomerID, c.Name;
📍2. Write a query to find the employee with the minimum salary in each department from a table Employees with columns EmployeeID, Name, DepartmentID, and Salary.
SELECT e1.DepartmentID, e1.EmployeeID, e1.Name, e1.Salary
FROM Employees e1
WHERE Salary = (SELECT MIN(Salary) FROM Employees e2 WHERE e2.DepartmentID = e1.DepartmentID);
📍3. Write a SQL query to find all products that have never been sold. Assume you have a table Products (ProductID, ProductName) and a table Sales (SaleID, ProductID, Quantity).
SELECT p.ProductID, p.ProductName
FROM Products p
LEFT JOIN Sales s ON p.ProductID = s.ProductID
WHERE s.ProductID IS NULL;
📍4. Given a table Orders with columns OrderID, CustomerID, OrderDate, and a table OrderItems with columns OrderID, ItemID, Quantity, write a query to find the customer with the highest total order quantity.
SELECT o.CustomerID, SUM(oi.Quantity) AS TotalQuantity
FROM Orders o
JOIN OrderItems oi ON o.OrderID = oi.OrderID
GROUP BY o.CustomerID
ORDER BY TotalQuantity DESC
LIMIT 1;
📍5. Write a SQL query to find the earliest order date for each customer from a table Orders (OrderID, CustomerID, OrderDate).
SELECT CustomerID, MIN(OrderDate) AS EarliestOrderDate
FROM Orders
GROUP BY CustomerID;
Hope it helps :)
SQL table interview questions:
1. What is a DUAL table and why do we need it?
- It is a special table that is created automatically when an Oracle database is installed. It can be used to select pseudo columns, perform calculations, act as a sequence generator, and so on.
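Typical DUAL usage in Oracle (the sequence name below is hypothetical):
SELECT SYSDATE FROM DUAL;                  -- pseudo column: current date and time
SELECT 2 * 21 AS result FROM DUAL;         -- ad-hoc calculation
SELECT my_sequence.NEXTVAL FROM DUAL;      -- generate the next sequence value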
2. How many columns and rows are present in DUAL table?
- one column & one row by default.
3. Can we insert more rows in to DUAL table?
- Yes.
4. What's the easiest way to back up a table / how can we create a table based on an existing table?
- CREATE TABLE SALES_COPY AS SELECT * FROM SALES;
5. Can we drop all the columns from a table?
- No.
6. What is the difference between count(1) and count(*)?
- They are the same: both count every row, consume the same amount of resources, and perform the same operation.
Most Important Python Topics for Data Analyst Interview:
#Basics of Python:
1. Data Types
2. Lists
3. Dictionaries
4. Control Structures:
- if-elif-else
- Loops
5. Functions
6. Practice basic FAQ questions; a few examples are mentioned below:
- How to reverse a string in Python?
- How to find the largest/smallest number in a list?
- How to remove duplicates from a list?
- How to count the occurrences of each element in a list?
- How to check if a string is a palindrome?
#Pandas:
1. Pandas Data Structures (Series, DataFrame)
2. Creating and Manipulating DataFrames
3. Filtering and Selecting Data
4. Grouping and Aggregating Data
5. Handling Missing Values
6. Merging and Joining DataFrames
7. Adding and Removing Columns
8. Exploratory Data Analysis (EDA):
- Descriptive Statistics
- Data Visualization with Pandas (Line Plots, Bar Plots, Histograms)
- Correlation and Covariance
- Handling Duplicates
- Data Transformation
#Numpy:
1. NumPy Arrays
2. Array Operations:
- Creating Arrays
- Slicing and Indexing
- Arithmetic Operations
Integration with Other Libraries:
1. Basic Data Visualization with Pandas (Line Plots, Bar Plots)
Key Concepts to Revise:
1. Data Manipulation with Pandas and NumPy
2. Data Cleaning Techniques
3. File Handling (reading and writing CSV files, JSON files)
4. Handling Missing and Duplicate Values
5. Data Transformation (scaling, normalization)
6. Data Aggregation and Group Operations
7. Combining and Merging Datasets
Hope this helps you 😊
Preparing for an online data analyst interview? Here’s a complete guide to ensure you’re ready to impress:
1. Mental Preparation
Visualize Success: Imagine yourself confidently answering questions and solving problems.
Stay Calm: Practice relaxation techniques like deep breathing or meditation to manage interview stress.
Set Clear Goals: Define what you aim to achieve and focus on showcasing your strengths.
2. Technical Setup
Check Your Equipment: Test your computer, camera, microphone, and internet connection to avoid technical glitches.
Platform Familiarity: Familiarize yourself with the video conferencing tool (Zoom, Teams, etc.) and ensure it’s updated.
Professional Background: Choose a clean, well-lit space or use a virtual background if necessary.
3. Environment
Quiet Space: Select a quiet room free from interruptions and let others know about your interview schedule.
Lighting and Camera: Position your camera at eye level and ensure you’re well-lit from the front to avoid shadows.
4. Interview Preparation
Review Key Concepts: Brush up on SQL, data manipulation, and visualization tools relevant to the role.
Practice with Online Tools: Get comfortable with online whiteboards or screen-sharing features if they’ll be used.
Prepare Your Questions: Develop insightful questions about the role, team, and company.
5. Day Before the Interview
Test Your Setup: Conduct a trial run with a friend or family member to ensure everything works smoothly.
Organize Documents: Have your resume, cover letter, and any required documents easily accessible on your computer.
Dress Professionally: Choose professional attire to set the right tone and boost your confidence.
6. Interview Day
Log in Early: Join the meeting a few minutes early to resolve any last-minute issues and show punctuality.
Engage Actively: Maintain eye contact by looking at the camera, and engage thoughtfully with the interviewer.
Data Analytics Resources
👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
5 Frequently Asked SQL Interview Questions with Answers in Data Engineering Interviews:
𝐃𝐢𝐟𝐟𝐢𝐜𝐮𝐥𝐭𝐲 - 𝐌𝐞𝐝𝐢𝐮𝐦
⚫️ Determine the Top 5 Products with the Highest Revenue in Each Category.
Schema: Products (ProductID, Name, CategoryID), Sales (SaleID, ProductID, Amount)
WITH ProductRevenue AS (
SELECT p.ProductID,
p.Name,
p.CategoryID,
SUM(s.Amount) AS TotalRevenue,
RANK() OVER (PARTITION BY p.CategoryID ORDER BY SUM(s.Amount) DESC) AS RevenueRank
FROM Products p
JOIN Sales s ON p.ProductID = s.ProductID
GROUP BY p.ProductID, p.Name, p.CategoryID
)
SELECT ProductID, Name, CategoryID, TotalRevenue
FROM ProductRevenue
WHERE RevenueRank <= 5;
⚫️ Identify Employees with Increasing Sales for Four Consecutive Quarters.
Schema: Sales (EmployeeID, SaleDate, Amount)
WITH QuarterlySales AS (
SELECT EmployeeID,
DATE_TRUNC('quarter', SaleDate) AS Quarter,
SUM(Amount) AS QuarterlyAmount
FROM Sales
GROUP BY EmployeeID, DATE_TRUNC('quarter', SaleDate)
),
SalesTrend AS (
SELECT EmployeeID,
Quarter,
QuarterlyAmount,
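-- previous quarters' totals for this employee (assumes sales exist in every quarter, i.e. no gaps)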
LAG(QuarterlyAmount, 1) OVER (PARTITION BY EmployeeID ORDER BY Quarter) AS PrevQuarter1,
LAG(QuarterlyAmount, 2) OVER (PARTITION BY EmployeeID ORDER BY Quarter) AS PrevQuarter2,
LAG(QuarterlyAmount, 3) OVER (PARTITION BY EmployeeID ORDER BY Quarter) AS PrevQuarter3
FROM QuarterlySales
)
SELECT EmployeeID, Quarter, QuarterlyAmount
FROM SalesTrend
WHERE QuarterlyAmount > PrevQuarter1 AND PrevQuarter1 > PrevQuarter2 AND PrevQuarter2 > PrevQuarter3;
⚫️ List Customers Who Made Purchases in Each of the Last Three Years.
Schema: Orders (OrderID, CustomerID, OrderDate)
WITH YearlyOrders AS (
SELECT CustomerID,
EXTRACT(YEAR FROM OrderDate) AS OrderYear
FROM Orders
GROUP BY CustomerID, EXTRACT(YEAR FROM OrderDate)
),
RecentYears AS (
SELECT DISTINCT OrderYear
FROM Orders
WHERE OrderDate >= CURRENT_DATE - INTERVAL '3 years'
),
CustomerYearlyOrders AS (
SELECT CustomerID,
COUNT(DISTINCT OrderYear) AS YearCount
FROM YearlyOrders
WHERE OrderYear IN (SELECT OrderYear FROM RecentYears)
GROUP BY CustomerID
)
SELECT CustomerID
FROM CustomerYearlyOrders
WHERE YearCount = 3;
⚫️ Find the Third Lowest Price for Each Product Category.
Schema: Products (ProductID, Name, CategoryID, Price)
WITH RankedPrices AS (
SELECT CategoryID,
Price,
DENSE_RANK() OVER (PARTITION BY CategoryID ORDER BY Price ASC) AS PriceRank
FROM Products
)
SELECT CategoryID, Price
FROM RankedPrices
WHERE PriceRank = 3;
⚫️ Identify Products with Total Sales Exceeding a Specified Threshold Over the Last 30 Days.
Schema: Sales (SaleID, ProductID, SaleDate, Amount)
WITH RecentSales AS (
SELECT ProductID,
SUM(Amount) AS TotalSales
FROM Sales
WHERE SaleDate >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY ProductID
)
SELECT ProductID, TotalSales
FROM RecentSales
WHERE TotalSales > 200;
Here you can find essential Interview Resources👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Like this post if you need more 👍❤️
Hope it helps :)
Almost everyone knows that these are the tools a Data Analyst works with:
➡️ SQL
➡️ Excel
➡️ Power BI/Tableau
➡️ Python
But people getting started with analytics are often confused about which of these tools to pick first.
There are various kinds of data analytics roles available in the market:
➡️ BI + SQL: Will primarily be involved in BI development.
➡️ SQL + Excel: Will primarily work on Excel reporting.
➡️ SQL + Python: Will primarily do data analysis using Python.
Now, if you are getting started with learning analytics, choose the one role that interests you most and focus on mastering the primary tools that role requires. Learn them VERY WELL.
Learn whichever of the above combinations interests you first, then start looking for opportunities that ask for those primary tools while simultaneously learning the basics of the third tool.
You don't have to be good with each and every tool; being good with any one of the above combinations always works.
Hope this helps you 😊
Most Asked SQL Interview Questions at MAANG Companies🔥🔥
Preparing for an SQL Interview at MAANG Companies? Here are some crucial SQL Questions you should be ready to tackle:
1. How do you retrieve all columns from a table?
SELECT * FROM table_name;
2. What SQL statement is used to filter records?
SELECT * FROM table_name
WHERE condition;
The WHERE clause is used to filter records based on a specified condition.
3. How can you join multiple tables? Describe different types of JOINs.
SELECT columns
FROM table1
JOIN table2 ON table1.column = table2.column
JOIN table3 ON table2.column = table3.column;
Types of JOINs:
1. INNER JOIN: Returns records with matching values in both tables
SELECT * FROM table1
INNER JOIN table2 ON table1.column = table2.column;
2. LEFT JOIN: Returns all records from the left table & matched records from the right table. Unmatched records will have NULL values.
SELECT * FROM table1
LEFT JOIN table2 ON table1.column = table2.column;
3. RIGHT JOIN: Returns all records from the right table & matched records from the left table. Unmatched records will have NULL values.
SELECT * FROM table1
RIGHT JOIN table2 ON table1.column = table2.column;
4. FULL JOIN: Returns records when there is a match in either left or right table. Unmatched records will have NULL values.
SELECT * FROM table1
FULL JOIN table2 ON table1.column = table2.column;
4. What is the difference between WHERE & HAVING clauses?
WHERE: Filters records before any groupings are made.
SELECT * FROM table_name
WHERE condition;
HAVING: Filters records after groupings are made.
SELECT column, COUNT(*)
FROM table_name
GROUP BY column
HAVING COUNT(*) > value;
5. How do you calculate average, sum, minimum & maximum values in a column?
Average: SELECT AVG(column_name) FROM table_name;
Sum: SELECT SUM(column_name) FROM table_name;
Minimum: SELECT MIN(column_name) FROM table_name;
Maximum: SELECT MAX(column_name) FROM table_name;
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
Must Study: Key Questions for Data Analysts 4.0
Advanced SQL
1. How do you handle hierarchical data and perform recursive queries in SQL?
2. What are common techniques for SQL performance tuning beyond indexing?
3. How do you implement SQL transactions and ensure atomicity in complex queries?
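For question 1, a hedged sketch of a recursive CTE that walks an organisational hierarchy, assuming a hypothetical Employees(EmployeeID, ManagerID, Name) table (SQL Server omits the RECURSIVE keyword):
WITH RECURSIVE OrgChart AS (
    SELECT EmployeeID, ManagerID, Name, 1 AS Level
    FROM Employees
    WHERE ManagerID IS NULL                              -- anchor: the top of the hierarchy
    UNION ALL
    SELECT e.EmployeeID, e.ManagerID, e.Name, oc.Level + 1
    FROM Employees e
    JOIN OrgChart oc ON e.ManagerID = oc.EmployeeID      -- recursive step: direct reports
)
SELECT * FROM OrgChart;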
Excel Advanced
1. How do you use Power Pivot to manage and analyze large datasets in Excel?
2. What are the best practices for creating and using Excel macros for automation?
3. How do you leverage Excel’s advanced charting tools for dynamic data visualization?
Power BI
1. How do you use Power Query to merge and transform data from multiple sources?
2. What are the key differences between calculated columns and measures in Power BI?
3. How do you design effective Power BI dashboards for executive reporting?
Python
1. How do you use Python’s pandas library for advanced data manipulation and analysis?
2. What are the best practices for deploying machine learning models using Python?
3. How do you perform time series analysis and forecasting with Python?
Data Visualization
1. How do you ensure your visualizations are accessible to people with visual impairments?
2. What are effective methods for visualizing multivariate data?
3. How do you use storytelling techniques to make your data visualizations more engaging?
Soft Skills
1. How do you handle conflicts and disagreements within a data team or with stakeholders?
2. What strategies do you use to effectively present complex data insights to a broad audience?
3. How do you stay updated with the latest trends and tools in data analytics?
I have curated Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
Final Preparation Guide for Data Analytics Interviews: (IMP)
➡Key SQL Concepts:
- Master SELECT statements, focusing on WHERE, ORDER BY, GROUP BY, and HAVING clauses.
- Understand the basics of JOINS: INNER, LEFT, RIGHT, FULL.
- Get comfortable with aggregate functions like COUNT, SUM, AVG, MAX, and MIN.
- Study subqueries and Common Table Expressions.
- Explore advanced topics like CASE statements, complex JOIN strategies, and Window functions (OVER, PARTITION BY, ROW_NUMBER, RANK).
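As a quick illustration of the CASE statements and window functions mentioned above (the sales table and its columns are hypothetical):
SELECT region,
       sales_rep,
       revenue,
       CASE WHEN revenue >= 100000 THEN 'High'
            WHEN revenue >= 50000  THEN 'Medium'
            ELSE 'Low' END AS revenue_band,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY revenue DESC) AS row_num,
       RANK()       OVER (PARTITION BY region ORDER BY revenue DESC) AS rev_rank,
       SUM(revenue) OVER (PARTITION BY region)                       AS region_total
FROM sales;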
➡Python for Data Analysis:
- Review the basics of Python syntax, control structures, and data structures (lists, dictionaries).
- Dive into data manipulation using Pandas and NumPy, covering DataFrames, Series, and group by operations.
- Learn basic plotting techniques with Matplotlib and Seaborn for data visualization.
➡ Excel Skills:
- Practice cell operations and essential formulas like SUMIFS, COUNTIFS, and AVERAGEIFS.
- Familiarize yourself with PivotTables, PivotCharts, data validation, and What-if analysis.
- Explore advanced formulas and work with the Data Model & Power Pivot.
➡ Power BI Proficiency:
- Focus on data modeling, including importing data and managing relationships.
- Learn data transformation techniques with Power Query and use DAX for calculated columns and measures.
- Create interactive reports and dashboards, and work on visualizations.
➡ Basic Statistics:
- Understand fundamental concepts like Mean, Median, Mode, Standard Deviation, and Variance.
- Study probability distributions, Hypothesis Testing, and P-values.
- Learn about Confidence Intervals, Correlation, and Simple Linear Regression.
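For quick revision, the core formulas behind these concepts, for a sample x1..xn:
Mean: x̄ = (1/n) · Σ xi
Sample variance: s² = Σ (xi − x̄)² / (n − 1), and standard deviation s = √s²
Pearson correlation: r = Σ (xi − x̄)(yi − ȳ) / [ √Σ(xi − x̄)² · √Σ(yi − ȳ)² ]
Simple linear regression: y = b0 + b1·x, with slope b1 = r · (sy / sx) and intercept b0 = ȳ − b1·x̄
Approximate 95% confidence interval for a mean (large n): x̄ ± 1.96 · s / √n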
I have curated the best 80+ top-notch Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
DATA ANALYST Interview Questions (0-3 yr) (SQL, Power BI)
👉 Power BI:
Q1: Explain step-by-step how you will create a sales dashboard from scratch.
Q2: Explain how you can optimize a slow Power BI report.
Q3: Explain any 5 chart types and their uses in representing different aspects of data.
👉SQL:
Q1: Explain the difference between RANK(), DENSE_RANK(), and ROW_NUMBER() functions with an example (sketch answers to Q1-Q5 follow after this list).
Q2 – Q4 use Table: employee (EmpID, ManagerID, JoinDate, Dept, Salary)
Q2: Find the nth highest salary from the Employee table.
Q3: You have an employee table with employee ID and manager ID. Find all employees under a specific manager, including their subordinates at any level.
Q4: Write a query to find the department-wise cumulative salary of employees who joined the company in the last 30 days.
Q5: Find the top 2 customers with the highest order amount for each product category, handling ties appropriately. Table: Customer (CustomerID, ProductCategory, OrderAmount)
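Sketch answers for Q1-Q5, using the employee and Customer tables as defined above; treat these as starting points, since date arithmetic and window-function details vary by database:
-- Q1: ROW_NUMBER gives 1,2,3,... with no repeats;
--     RANK leaves gaps after ties (1,1,3); DENSE_RANK does not (1,1,2).
SELECT EmpID, Salary,
       ROW_NUMBER() OVER (ORDER BY Salary DESC) AS row_num,
       RANK()       OVER (ORDER BY Salary DESC) AS rnk,
       DENSE_RANK() OVER (ORDER BY Salary DESC) AS dense_rnk
FROM employee;
-- Q2: nth highest salary (replace :n with the desired rank)
SELECT DISTINCT Salary
FROM (
    SELECT Salary, DENSE_RANK() OVER (ORDER BY Salary DESC) AS rnk
    FROM employee
) ranked
WHERE rnk = :n;
-- Q3: see the recursive CTE pattern sketched earlier under Advanced SQL,
--     starting from the chosen manager's EmpID instead of the top of the tree.
-- Q4: department-wise cumulative salary for employees who joined in the last 30 days
SELECT Dept, EmpID, JoinDate, Salary,
       SUM(Salary) OVER (PARTITION BY Dept ORDER BY JoinDate) AS cumulative_salary
FROM employee
WHERE JoinDate >= CURRENT_DATE - INTERVAL '30' DAY;   -- date syntax varies by database
-- Q5: top 2 customers per product category, keeping ties
SELECT CustomerID, ProductCategory, OrderAmount
FROM (
    SELECT CustomerID, ProductCategory, OrderAmount,
           DENSE_RANK() OVER (PARTITION BY ProductCategory
                              ORDER BY OrderAmount DESC) AS rnk
    FROM Customer
) t
WHERE rnk <= 2;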
👉Behavioral:
Q1: Why do you want to become a data analyst and why did you apply to this company?
Q2: Describe a time when you had to manage a difficult task with tight deadlines. How did you handle it?
I have curated the best 80+ top-notch Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
Excel interview questions for both data analysts and business analysts
1) What are the basic functions of Microsoft Excel?
2) Explain the difference between a workbook and a worksheet.
3) How would you freeze panes in Excel?
4) Can you name some common keyboard shortcuts in Excel?
5) What is the purpose of VLOOKUP and HLOOKUP?
6) How do you remove duplicate values in Excel?
7) Explain the steps to filter data in Excel.
8) What is the significance of the "IF" function in Excel, and can you provide an example of its use?
9) How would you create a pivot table in Excel?
10) Explain the use of the CONCATENATE function in Excel.
11) How do you create a chart in Excel?
12) Explain the difference between a line chart and a scatter plot.
13) What is conditional formatting, and how can it be applied in Excel?
14) How would you create a dynamic chart that updates with new data?
15) What is the INDEX-MATCH function, and how is it different from VLOOKUP?
16) Can you explain the concept of "PivotTables" and when you would use them?
17) How do you use the "COUNTIF" and "SUMIF" functions in Excel?
18) Explain the purpose of the "What-If Analysis" tools in Excel.
19) What are array formulas, and can you provide an example of their use?
Business Analysis Specific:
1) How would you analyze a set of sales data to identify trends and insights?
2) Explain how you might use Excel to perform financial modeling.
3) What Excel features would you use for forecasting and budgeting?
4) How do you handle large datasets in Excel, and what tools or techniques do you use for optimization?
5) What are some common techniques for cleaning and validating data in Excel?
6) How do you identify and handle errors in a dataset using Excel?
Scenario-based Questions:
1) Imagine you have a dataset with missing values. How would you approach this problem in Excel?
2) You are given a dataset with multiple sheets. How would you consolidate the data for analysis?
I have curated the best 80+ top-notch Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
🚨Here is a comprehensive list of #interview questions that are commonly asked in job interviews for Data Scientist, Data Analyst, and Data Engineer positions:
➡️ Data Scientist Interview Questions
Technical Questions
1) What are your preferred programming languages for data science, and why?
2) Can you write a Python script to perform data cleaning on a given dataset?
3) Explain the Central Limit Theorem.
4) How do you handle missing data in a dataset?
5) Describe the difference between supervised and unsupervised learning.
6) How do you select the right algorithm for your model?
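A one-line refresher for question 3: the Central Limit Theorem says that for independent, identically distributed observations with mean μ and finite variance σ², the sample mean of n observations is approximately normally distributed once n is large:
x̄ ≈ N(μ, σ²/n), i.e. (x̄ − μ) / (σ/√n) → N(0, 1) as n grows.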
Questions Related To Problem-Solving and Projects
7) Walk me through a data science project you have worked on.
8) How did you handle data preprocessing in your project?
9) How do you evaluate the performance of a machine learning model?
10) What techniques do you use to prevent overfitting?
➡️Data Analyst Interview Questions
Technical Questions
1) Write a SQL query to find the second highest salary from the employee table (a sketch answer follows after this list).
2) How would you optimize a slow-running query?
3) How do you use pivot tables in Excel?
4) Explain the VLOOKUP function.
5) How do you handle outliers in your data?
6) Describe the steps you take to clean a dataset.
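For question 1 above, a classic sketch that avoids window functions (an alternative to the DENSE_RANK approach sketched earlier; it assumes an employee table with a Salary column):
-- second highest salary: the maximum salary strictly below the overall maximum
SELECT MAX(Salary) AS second_highest_salary
FROM employee
WHERE Salary < (SELECT MAX(Salary) FROM employee);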
Analytical Questions
7) How do you interpret data to make business decisions?
8) Give an example of a time when your analysis directly influenced a business decision.
9) What are your preferred tools for data analysis and why?
10) How do you ensure the accuracy of your analysis?
➡️Data Engineer Interview Questions
Technical Questions
1) What is your experience with SQL and NoSQL databases?
2) How do you design a scalable database architecture?
3) Explain the ETL process you follow in your projects.
4) How do you handle data transformation and loading efficiently?
5) What is your experience with Hadoop/Spark?
6) How do you manage and process large datasets?
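For question 3, the extract step usually lives in a pipeline tool, but the transform-and-load part is often plain SQL; a minimal sketch, assuming hypothetical sales_staging and sales_clean tables invented only for illustration:
-- Transform rows from a raw staging table and load them into a clean target table
INSERT INTO sales_clean (order_id, order_date, region, amount)
SELECT DISTINCT
       CAST(order_id AS INTEGER),
       CAST(order_date AS DATE),
       UPPER(TRIM(region)),
       COALESCE(amount, 0)
FROM sales_staging
WHERE order_id IS NOT NULL;          -- drop rows that fail a basic validity check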
Questions Related To Problem-Solving and Optimization
7) Describe a data pipeline you have built.
8) What challenges did you face, and how did you overcome them?
9) How do you ensure your data processes run efficiently?
10) Describe a time when you had to optimize a slow data pipeline.
I have curated the best 80+ top-notch Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊
A - Always check your assumptions
B - Backup your data
C - Check your code
D - Do you know your data?
E - Evaluate your results
F - Find the anomalies
G - Get help when you need it
H - Have a backup plan
I - Investigate your outliers
J - Justify your methods
K - Keep your data clean
L - Let your data tell a story
M - Make your visualizations impactful
N - No one knows everything
O - Outline your analysis
P - Practice good documentation
Q - Quality control is key
R - Review your work
S - Stay organized
T - Test your assumptions
U - Use the right tools
V - Verify your results
W - Write clear and concise reports
X - Xamine for gaps in data
Y - Yield to the evidence
Z - Zero in on your findings
If you can master the ABCs of data analysis, you will be well on your way to being a successful Data Analyst.
𝐇𝐨𝐰 𝐭𝐨 𝐏𝐫𝐞𝐩𝐚𝐫𝐞 𝐭𝐨 𝐁𝐞𝐜𝐨𝐦𝐞 𝐚 𝐃𝐚𝐭𝐚 𝐀𝐧𝐚𝐥𝐲𝐬𝐭
𝟏. 𝐄𝐱𝐜𝐞𝐥- Learn formulas, Pivot Tables, lookup functions, and VBA macros.
𝟐. 𝐒𝐐𝐋- Joins, window functions, and CTEs are the most important.
𝟑. 𝐏𝐨𝐰𝐞𝐫 𝐁𝐈- Power Query Editor (PQE), DAX, M code, RLS (row-level security).
𝟒. 𝐏𝐲𝐭𝐡𝐨𝐧- Basics & libraries (mainly pandas, NumPy, Matplotlib, and Seaborn).
5. Practice SQL and Python questions on platforms like 𝐇𝐚𝐜𝐤𝐞𝐫𝐑𝐚𝐧𝐤 or 𝐖𝟑𝐒𝐜𝐡𝐨𝐨𝐥𝐬.
6. Know the basics of descriptive statistics (mean, median, mode) and probability (normal, binomial, and Poisson distributions, etc.).
7. Learn to use 𝐀𝐈/𝐂𝐨𝐩𝐢𝐥𝐨𝐭 𝐭𝐨𝐨𝐥𝐬 like GitHub Copilot or Power BI's AI features to automate tasks, generate insights, and improve your projects (these skills are most in demand at companies now).
8. Get hands-on experience with one cloud platform: 𝐀𝐳𝐮𝐫𝐞, 𝐀𝐖𝐒, 𝐨𝐫 𝐆𝐂𝐏
9. Work on at least two end-to-end projects.
10. Prepare an ATS-friendly resume and start applying for jobs.
11. Prepare for interviews by going through common interview questions on Google and YouTube.
I have curated the best 80+ top-notch Data Analytics Resources 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02
Hope this helps you 😊