Data Analytics – Telegram
Data Analytics
108K subscribers
126 photos
2 files
791 links
Perfect channel to learn Data Analytics

Learn SQL, Python, Alteryx, Tableau, Power BI and many more

For Promotions: @coderfun @love_data
You don't need to know everything about every data tool. Focus on what will help you land the job.

For Excel:
- IFS (all variations)
- XLOOKUP
- IMPORTRANGE (in GSheets)
- Pivot Tables
- Dynamic functions like TODAY()

For SQL:
- Sum
- Group By
- Window Functions
- CTEs
- Joins
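As a rough sketch of how these pieces fit together, here's a runnable example using Python's built-in sqlite3 (the table and values are invented for illustration; window functions need SQLite 3.25+):

```python
import sqlite3

# In-memory toy database (table and data invented for illustration)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER);
INSERT INTO employees VALUES
  ('Ana', 'Sales', 50000),
  ('Ben', 'Sales', 60000),
  ('Cara', 'IT', 70000);
""")

# CTE + GROUP BY + SUM, then a window function over the result
rows = conn.execute("""
WITH dept_totals AS (
    SELECT dept, SUM(salary) AS total
    FROM employees
    GROUP BY dept
)
SELECT dept, total,
       RANK() OVER (ORDER BY total DESC) AS rnk
FROM dept_totals
ORDER BY rnk
""").fetchall()
print(rows)  # [('Sales', 110000, 1), ('IT', 70000, 2)]
```

Same ideas transfer directly to any SQL database once you know the syntax.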

For Tableau:
- Calculated Columns
- Sets
- Groups
- Formatting

For Power BI:
- Power Query for data transformation
- DAX (Data Analysis Expressions) for creating custom calculations
- Relationships between tables
- Creating interactive and dynamic dashboards
- Utilizing slicers and filters effectively

You can now find Data Analytics Resources on WhatsApp 👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02

Hope this helps you 😊
If you have time to learn...!

You have time to clean data...!

You have time to start from scratch...!

You have time to become a Data Analyst...!!


➜ learn Excel
➜ learn SQL
➜ learn either Power BI or Tableau
➜ learn what the heck ATS is and how to get around it
➜ learn to be ready for any interview question
➜ build projects for a portfolio
➜ invest the time in your future
➜ fail and pick yourself back up


And you don't need to do it all at once!

You can now find Data Analytics Resources on WhatsApp as well
👇👇
https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02

Hope this helps you 😊
Day 30: Final Review & SQL Projects

1. Recap of Key Topics

Week 1 (SQL Basics): SELECT, WHERE, ORDER BY, Aggregations, GROUP BY
Week 2 (Intermediate SQL): JOINS, Subqueries, String & Date Functions, UNION
Week 3 (Advanced SQL): CTEs, Window Functions, Transactions, Indexing
Week 4 (Database Management): Constraints, Performance Tuning, Stored Procedures

2. SQL Mini Projects (Hands-On Practice)

📌 Project 1: Employee Database Analysis


👉 Skills Used: Joins, Aggregations, Window Functions
🔹 Find the top 5 highest-paid employees in each department.
🔹 Calculate the average salary per department using GROUP BY.

Example Query:

SELECT DepartmentID, Name, Salary, SalaryRank
FROM (
    SELECT DepartmentID, Name, Salary,
           RANK() OVER (PARTITION BY DepartmentID ORDER BY Salary DESC) AS SalaryRank
    FROM Employees
) ranked
WHERE SalaryRank <= 5;


📌 Project 2: E-Commerce Sales Insights
👉 Skills Used: Joins, Date Functions, Subqueries
🔹 Find the total revenue generated in the last 6 months.
🔹 Identify the top-selling products.

Example Query:

SELECT ProductID, SUM(TotalAmount) AS TotalSales
FROM Orders
-- DATEADD/GETDATE are SQL Server functions; in MySQL use DATE_SUB(NOW(), INTERVAL 6 MONTH)
WHERE OrderDate >= DATEADD(MONTH, -6, GETDATE())
GROUP BY ProductID
ORDER BY TotalSales DESC;


📌 Project 3: Customer Retention Analysis
👉 Skills Used: CTEs, Window Functions, Recursive Queries
🔹 Identify customers who made repeat purchases.
🔹 Find the time gap between first and last purchase.

Example Query:

WITH CustomerOrders AS (
    SELECT CustomerID, OrderDate,
           COUNT(*) OVER (PARTITION BY CustomerID) AS OrderCount
    FROM Orders
)
SELECT CustomerID,
       MIN(OrderDate) AS FirstPurchase,
       MAX(OrderDate) AS LastPurchase
FROM CustomerOrders
WHERE OrderCount > 1
GROUP BY CustomerID;


3. What’s Next?

🚀 Continue Improving: Solve problems on LeetCode, StrataScratch, SQLZoo & 365datascience
📈 Build Projects: Create a portfolio with real-world datasets
📚 Learn Advanced Topics: Explore Data Warehousing, BigQuery, NoSQL

🎉 Congratulations on Completing the 30-Day SQL Challenge! 🎉

If you found this useful, like this post and share it with your friends!

Here you can find SQL Interview Resources👇
https://news.1rj.ru/str/sqlanalyst

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
Data Analyst Roadmap:

- Tier 1: Learn Excel & SQL
- Tier 2: Data Cleaning & Exploratory Data Analysis (EDA)
- Tier 3: Data Visualization & Business Intelligence (BI) Tools
- Tier 4: Statistical Analysis & Machine Learning Basics

Then build projects that include:

- Data Collection
- Data Cleaning
- Data Analysis
- Data Visualization

And if you want to make your portfolio stand out more:

- Solve real business problems
- Provide clear, impactful insights
- Create a presentation
- Record a video presentation
- Target specific industries
- Reach out to companies

Hope this helps you 😊
Data Analyst Interview Series Part-6

What is the difference between COUNT(), COUNT(*), and COUNT(DISTINCT) in SQL?

COUNT(column_name): Counts non-null values in a specific column.
COUNT(*): Counts all rows, including NULL values.
COUNT(DISTINCT column_name): Counts unique non-null values in a column.

Example:

SELECT COUNT(salary) FROM employees;              -- Counts non-null salaries
SELECT COUNT(*) FROM employees;                   -- Counts all rows
SELECT COUNT(DISTINCT department) FROM employees; -- Counts unique departments
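A quick way to see the difference in action is Python's sqlite3 module with a toy table (the data here is invented):

```python
import sqlite3

# Toy table with a NULL salary and a repeated department
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (name TEXT, salary INTEGER, department TEXT);
INSERT INTO employees VALUES
  ('A', 100, 'HR'),
  ('B', NULL, 'HR'),
  ('C', 200, 'IT');
""")

count_salary = conn.execute("SELECT COUNT(salary) FROM employees").fetchone()[0]
count_star = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
count_dept = conn.execute("SELECT COUNT(DISTINCT department) FROM employees").fetchone()[0]
print(count_salary, count_star, count_dept)  # 2 3 2
```

COUNT(salary) skips the NULL, COUNT(*) does not, and COUNT(DISTINCT department) collapses the repeated 'HR'.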


What are the different types of filters in Power BI?

Power BI provides several types of filters:

Visual-level filters: Apply to a single visual.
Page-level filters: Apply to all visuals on a report page.
Report-level filters: Apply to the entire report.
Drillthrough filters: Allow focusing on specific details by navigating to another report page.
Top N filters: Show only the top N values based on a measure.
Example: Using a Top N filter to show the top 5 performing products in sales.

How do you use the VLOOKUP function in Excel?

VLOOKUP searches for a value in the first column of a range and returns a corresponding value from another column.

Syntax:
=VLOOKUP(lookup_value, table_array, col_index_num, [range_lookup])

Example: To find an employee’s department based on their ID:
=VLOOKUP(101, A2:C10, 2, FALSE)

101 → Value to search for
A2:C10 → Table range
2 → Column number to return data from
FALSE → Exact match
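For analysts moving to Python, a rough pandas analogue of this exact-match lookup (the table and values here are invented to mirror the A2:C10 range above) is a boolean filter:

```python
import pandas as pd

# Hypothetical lookup table: employee ID -> department
table = pd.DataFrame({
    "ID": [100, 101, 102],
    "Department": ["HR", "Sales", "IT"],
    "City": ["Pune", "Delhi", "Mumbai"],
})

# Exact-match lookup of ID 101, like =VLOOKUP(101, A2:C10, 2, FALSE)
result = table.loc[table["ID"] == 101, "Department"].iloc[0]
print(result)  # Sales
```

For looking up a whole column of values at once, pandas' merge is the closer equivalent.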

What is the difference between a Bar Chart and a Column Chart?

Bar Chart: Uses horizontal bars, suitable for comparing categories.
Column Chart: Uses vertical bars, good for showing trends over time.
Example:
A Bar Chart is useful for comparing sales across regions.
A Column Chart is useful for showing monthly revenue growth.

How do you handle missing data in Pandas?

Pandas provides multiple ways to handle missing data:

Remove missing values: df.dropna()

Fill missing values with a default value: df.fillna(0)

Fill missing values with the column mean: df['Salary'] = df['Salary'].fillna(df['Salary'].mean())

Forward fill (copy previous value): df.ffill()

Backward fill (copy next value): df.bfill()

These methods ensure data quality while preventing errors in analysis.
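Here's a minimal, self-contained sketch of these options (the toy data is invented):

```python
import pandas as pd
import numpy as np

# Small frame with one missing salary
df = pd.DataFrame({"Name": ["A", "B", "C"], "Salary": [100.0, np.nan, 300.0]})

dropped = df.dropna()                                   # remove rows with NaN
filled_zero = df["Salary"].fillna(0)                    # fill with a constant
filled_mean = df["Salary"].fillna(df["Salary"].mean())  # fill with the mean (200.0)
ffilled = df["Salary"].ffill()                          # copy the previous value down

print(len(dropped), filled_mean.tolist())  # 2 [100.0, 200.0, 300.0]
```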

Like this post if you want me to continue the interview series 👍♥️

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
As a data analytics enthusiast, the end goal is not just to learn SQL, Power BI, Python, Excel, etc., but to get a job as a Data Analyst 👨‍💻

Back then, when I was trying to switch my career into data analytics, I used to set aside 1 to 1.5 hours of my day to search for job openings related to data analytics and business intelligence.

Before going to bed, I would spend the first 30 minutes going through job portals such as Naukri and LinkedIn to find relevant openings, and the next hour collecting keywords from the job descriptions to tailor my resume and searching for people who could refer me for the role.

📍 I advise every aspiring data analyst to set dedicated time for searching and applying for jobs.

📍 To get into data analytics, applying for jobs is as important as learning and upskilling.

If you are not applying for jobs, you are simply delaying your entry into data analytics 👨‍💻📊

Hope this helps you 😊
Data Analyst Interview Part-7

What is the difference between INNER JOIN, LEFT JOIN, RIGHT JOIN, and FULL JOIN in SQL?

INNER JOIN
: Returns only matching rows from both tables.
LEFT JOIN: Returns all rows from the left table and matching rows from the right table. If no match, NULL is returned.
RIGHT JOIN: Returns all rows from the right table and matching rows from the left table. If no match, NULL is returned.
FULL JOIN: Returns all rows from both tables, filling unmatched values with NULL.

Example:

SELECT employees.name, departments.department_name
FROM employees
INNER JOIN departments ON employees.department_id = departments.id;

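The same join semantics can be sketched in pandas with merge's how parameter (the two toy tables below are invented for illustration):

```python
import pandas as pd

# Toy employees/departments tables: employee B has no matching department,
# and department 2 has no matching employee
employees = pd.DataFrame({"name": ["A", "B"], "department_id": [1, 3]})
departments = pd.DataFrame({"id": [1, 2], "department_name": ["HR", "IT"]})

inner = employees.merge(departments, left_on="department_id", right_on="id", how="inner")
left = employees.merge(departments, left_on="department_id", right_on="id", how="left")
outer = employees.merge(departments, left_on="department_id", right_on="id", how="outer")

print(len(inner), len(left), len(outer))  # 1 2 3
```

INNER keeps only the match, LEFT keeps both employees (with NaN for B), and FULL/outer keeps all three keys.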

What are Measures and Calculated Columns in Power BI?

Measures: Used for dynamic calculations based on user interactions. They are created using DAX and are not stored in tables.
Calculated Columns: Are static columns created using DAX, stored in a table, and calculated when the data is loaded.

Example of a Measure (Total Sales):
Total Sales = SUM(Sales[Amount])

Example of a Calculated Column (Profit Margin):
Profit Margin = Sales[Profit] / Sales[Revenue]


How do you remove duplicate values in Excel?

To remove duplicates, use the Remove Duplicates feature:
Select the data range.
Click Data > Remove Duplicates.
Choose the columns to check for duplicates.
Click OK.

Alternatively, use a formula to highlight duplicates:
=COUNTIF(A:A, A2) > 1
For Power Query:

Load data into Power Query.
Select columns.
Click Remove Duplicates.


What is a Heatmap in Data Visualization?

A heatmap is a graphical representation where values are represented by colors. It is used to visualize density, intensity, or correlation between variables.

Common use cases:
Website click heatmaps to analyze user behavior.

Correlation heatmaps in data science to show relationships between variables.

In Python, create a heatmap using Seaborn:

import seaborn as sns
import matplotlib.pyplot as plt

sns.heatmap(df.corr(), annot=True, cmap="coolwarm")
plt.show()


What is the difference between APPLY and MAP functions in Pandas?

map(): Used for element-wise transformations on a single Pandas Series.
apply(): Used for applying functions to a Series or an entire DataFrame.

Example using map():
df['Salary'] = df['Salary'].map(lambda x: x * 1.1) # Increases salary by 10%

Example using apply():
df['New Salary'] = df['Salary'].apply(lambda x: x * 1.1)

For DataFrames, apply() can work on rows or columns:
df.apply(lambda x: x.max() - x.min(), axis=1) # Row-wise difference
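Since the snippets above assume an existing df, here's a self-contained sketch of the same ideas on invented data:

```python
import pandas as pd

# Illustrative salary data
df = pd.DataFrame({"Salary": [100.0, 200.0]})

mapped = df["Salary"].map(lambda x: x * 1.1)     # element-wise on a Series
applied = df["Salary"].apply(lambda x: x * 1.1)  # same result on a Series

# On a DataFrame, apply works per column (default) or per row (axis=1)
spread = df.apply(lambda col: col.max() - col.min())

print(mapped.tolist(), spread["Salary"])
```

On a single Series the two are interchangeable here; the difference shows up when you need whole rows or columns, which only apply can give you.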

Like this post if you want me to continue the interview series 👍♥️

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
Data Analyst Learning Plan in 2025

|-- Week 1: Introduction to Data Analysis
|   |-- Data Analysis Fundamentals
|   |   |-- What is Data Analysis?
|   |   |-- Types of Data Analysis
|   |   |-- Data Analysis Workflow
|   |-- Tools and Environment Setup
|   |   |-- Overview of Tools (Excel, SQL)
|   |   |-- Installing Necessary Software
|   |   |-- Setting Up Your Workspace
|   |-- First Data Analysis Project
|   |   |-- Data Collection
|   |   |-- Data Cleaning
|   |   |-- Basic Data Exploration
|
|-- Week 2: Data Collection and Cleaning
|   |-- Data Collection Methods
|   |   |-- Primary vs. Secondary Data
|   |   |-- Web Scraping
|   |   |-- APIs
|   |-- Data Cleaning Techniques
|   |   |-- Handling Missing Values
|   |   |-- Data Transformation
|   |   |-- Data Normalization
|   |-- Data Quality
|   |   |-- Ensuring Data Accuracy
|   |   |-- Data Integrity
|   |   |-- Data Validation
|
|-- Week 3: Data Exploration and Visualization
|   |-- Exploratory Data Analysis (EDA)
|   |   |-- Descriptive Statistics
|   |   |-- Data Distribution
|   |   |-- Correlation Analysis
|   |-- Data Visualization Basics
|   |   |-- Choosing the Right Chart Type
|   |   |-- Creating Basic Charts
|   |   |-- Customizing Visuals
|   |-- Advanced Data Visualization
|   |   |-- Interactive Dashboards
|   |   |-- Storytelling with Data
|   |   |-- Data Presentation Techniques
|
|-- Week 4: Statistical Analysis
|   |-- Introduction to Statistics
|   |   |-- Descriptive vs. Inferential Statistics
|   |   |-- Probability Theory
|   |-- Hypothesis Testing
|   |   |-- Null and Alternative Hypotheses
|   |   |-- t-tests, Chi-square tests
|   |   |-- p-values and Significance Levels
|   |-- Regression Analysis
|   |   |-- Simple Linear Regression
|   |   |-- Multiple Linear Regression
|   |   |-- Logistic Regression
|
|-- Week 5: SQL for Data Analysis
|   |-- SQL Basics
|   |   |-- SQL Syntax
|   |   |-- Select, Insert, Update, Delete
|   |-- Advanced SQL
|   |   |-- Joins and Subqueries
|   |   |-- Window Functions
|   |   |-- Stored Procedures
|   |-- SQL for Data Analysis
|   |   |-- Data Aggregation
|   |   |-- Data Transformation
|   |   |-- SQL for Reporting
|
|-- Week 6-8: Python for Data Analysis
|   |-- Python Basics
|   |   |-- Python Syntax
|   |   |-- Data Types and Structures
|   |   |-- Functions and Loops
|   |-- Data Analysis with Python
|   |   |-- NumPy for Numerical Data
|   |   |-- Pandas for Data Manipulation
|   |   |-- Matplotlib and Seaborn for Visualization
|   |-- Advanced Data Analysis in Python
|   |   |-- Time Series Analysis
|   |   |-- Machine Learning Basics
|   |   |-- Data Pipelines
|
|-- Week 9-11: Real-world Applications and Projects
|   |-- Capstone Project
|   |   |-- Project Planning
|   |   |-- Data Collection and Preparation
|   |   |-- Building and Optimizing Models
|   |   |-- Creating and Publishing Reports
|   |-- Case Studies
|   |   |-- Business Use Cases
|   |   |-- Industry-specific Solutions
|   |-- Integration with Other Tools
|   |   |-- Data Analysis with Excel
|   |   |-- Data Analysis with R
|   |   |-- Data Analysis with Tableau/Power BI
|
|-- Week 12: Post-Project Learning
|   |-- Data Analysis for Business Intelligence
|   |   |-- KPI Dashboards
|   |   |-- Financial Reporting
|   |   |-- Sales and Marketing Analytics
|   |-- Advanced Data Analysis Topics
|   |   |-- Big Data Technologies
|   |   |-- Cloud Data Warehousing
|   |-- Continuing Education
|   |   |-- Advanced Data Analysis Techniques
|   |   |-- Community and Forums
|   |   |-- Keeping Up with Updates
|
|-- Resources and Community
|   |-- Online Courses (edX, Udemy)
|   |-- Data Analysis Blogs
|   |-- Data Analysis Communities

Follow the Data Analysts - SQL, Tableau, Excel, Power BI & Python channel on WhatsApp: https://whatsapp.com/channel/0029VaGgzAk72WTmQFERKh02

Like this post for more content like this 👍♥️

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
Tableau Learning Plan in 2025

|-- Week 1: Introduction to Tableau
|   |-- Tableau Basics
|   |   |-- What is Tableau?
|   |   |-- Tableau Products Overview (Desktop, Public, Online, Server)
|   |   |-- Installing Tableau Desktop
|   |-- Setting up Tableau Environment
|   |   |-- Connecting to Data Sources
|   |   |-- Overview of the Tableau Interface
|   |   |-- Basic Operations (Open, Save, Close)
|   |-- First Tableau Dashboard
|   |   |-- Creating a Simple Dashboard
|   |   |-- Basic Charts and Visualizations
|   |   |-- Adding Filters and Actions
|
|-- Week 2: Data Preparation and Transformation
|   |-- Data Connections
|   |   |-- Connecting to Various Data Sources (Excel, SQL, Web Data)
|   |   |-- Data Extracts vs. Live Connections
|   |-- Data Cleaning and Shaping
|   |   |-- Data Interpreter
|   |   |-- Pivot and Unpivot Data
|   |   |-- Handling Null Values
|   |-- Data Blending and Joins
|   |   |-- Data Blending
|   |   |-- Joins and Relationships
|   |   |-- Union Data
|
|-- Week 3: Intermediate Tableau
|   |-- Advanced Calculations
|   |   |-- Calculated Fields
|   |   |-- Table Calculations
|   |   |-- Level of Detail (LOD) Expressions
|   |-- Advanced Visualizations
|   |   |-- Dual-Axis Charts
|   |   |-- Heat Maps and Highlight Tables
|   |   |-- Custom Geocoding
|   |-- Dashboard Interactivity
|   |   |-- Filters and Parameters
|   |   |-- Dashboard Actions
|   |   |-- Using Stories for Narrative
|
|-- Week 4: Data Visualization Best Practices
|   |-- Design Principles
|   |   |-- Choosing the Right Chart Type
|   |   |-- Color Theory
|   |   |-- Layout and Formatting
|   |-- Advanced Mapping
|   |   |-- Creating and Customizing Maps
|   |   |-- Using Map Layers
|   |   |-- Geographic Data Visualization
|   |-- Performance Optimization
|   |   |-- Optimizing Data Sources
|   |   |-- Reducing Load Times
|   |   |-- Extracts and Aggregations
|
|-- Week 5: Tableau for Business Intelligence
|   |-- Business Dashboards
|   |   |-- KPI Dashboards
|   |   |-- Sales and Revenue Dashboards
|   |   |-- Financial Dashboards
|   |-- Storytelling with Data
|   |   |-- Creating Data Stories
|   |   |-- Using Annotations
|   |   |-- Interactive Dashboards
|   |-- Sharing and Collaboration
|   |   |-- Publishing to Tableau Server/Public
|   |   |-- Tableau Online Collaboration
|   |   |-- Embedding Dashboards in Websites
|
|-- Week 6-8: Advanced Tableau Techniques
|   |-- Tableau Prep
|   |   |-- Data Preparation Workflows
|   |   |-- Cleaning and Shaping Data with Tableau Prep
|   |   |-- Combining Data from Multiple Sources
|   |-- Tableau and Scripting
|   |   |-- Using R and Python in Tableau
|   |   |-- Advanced Analytics with Scripting
|   |-- Advanced Analytics
|   |   |-- Forecasting
|   |   |-- Clustering
|   |   |-- Trend Lines
|   |-- Tableau Extensions
|   |   |-- Installing and Using Extensions
|   |   |-- Popular Extensions Overview
|
|-- Week 9-11: Real-world Applications and Projects
|   |-- Capstone Project
|   |   |-- Project Planning
|   |   |-- Data Collection and Preparation
|   |   |-- Building and Optimizing Dashboards
|   |   |-- Creating and Publishing Reports
|   |-- Case Studies
|   |   |-- Business Use Cases
|   |   |-- Industry-specific Solutions
|   |-- Integration with Other Tools
|   |   |-- Tableau and SQL
|   |   |-- Tableau and Excel
|   |   |-- Tableau and Power BI
|
|-- Week 12: Post-Project Learning
|   |-- Tableau Administration
|   |   |-- Managing Tableau Server
|   |   |-- User Roles and Permissions
|   |   |-- Monitoring and Auditing
|   |-- Advanced Tableau Topics
|   |   |-- New Tableau Features
|   |   |-- Latest Tableau Techniques
|   |   |-- Community and Forums
|   |   |-- Keeping Up with Updates
|
|-- Resources and Community
|   |-- Online Courses (Tableau Official)
|   |-- Tableau Blogs and Podcasts
|   |-- Tableau Communities

You can refer these Tableau Interview Resources to learn more: https://whatsapp.com/channel/0029VasYW1V5kg6z4EHOHG1t

Like this post for more resources ♥️

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
It takes time to learn Excel.

It takes time to master SQL.

It takes time to understand Power BI.

It takes time to analyze complex datasets.

It takes time to create impactful dashboards.

It takes time to work on real-world data projects.

It takes time to build a strong LinkedIn profile.

It takes time to prepare for technical and behavioral interviews.

Here’s one tip from someone who’s been through it all:

Be Patient. Good things take time ☺️

Keep building your skills and showcasing your value. Your time will come!
Data Analyst Interview Part-8

How do you perform data cleaning in Python?

Data cleaning in Python involves several steps:

Handling missing data:
Drop missing values: df.dropna()
Fill missing values: df.fillna(value)

Removing duplicates:
df.drop_duplicates()

Converting data types:
df['column'] = df['column'].astype(int)

Handling outliers:
Use filtering or statistical methods to identify and remove outliers.

Standardizing or normalizing data:
Use libraries like scikit-learn for scaling:

from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
df[['column']] = scaler.fit_transform(df[['column']])

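Putting the first few steps together, here's a minimal end-to-end cleaning sketch (the raw data is invented):

```python
import pandas as pd

# Illustrative raw data: a missing value, a duplicate row, and string-typed numbers
df = pd.DataFrame({
    "age": ["25", "30", None, "30"],
    "city": ["Pune", "Delhi", "Delhi", "Delhi"],
})

df = df.drop_duplicates()          # remove the exact duplicate row
df = df.dropna(subset=["age"])     # drop rows missing age
df["age"] = df["age"].astype(int)  # convert the type

print(df["age"].tolist())  # [25, 30]
```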

What is the use of GROUP BY in SQL?

GROUP BY is used to group rows that have the same values into summary rows, often with aggregate functions like COUNT, SUM, AVG, etc.

Example:

SELECT department, AVG(salary)
FROM employees
GROUP BY department;

This will calculate the average salary for each department.

What is the significance of normalization in SQL?

Normalization is the process of organizing data in a way that reduces redundancy and dependency by dividing large tables into smaller ones and using relationships (foreign keys).

1st Normal Form (1NF): Ensures atomicity (no multi-valued fields).

2nd Normal Form (2NF): Ensures that all non-key attributes are fully dependent on the primary key.

3rd Normal Form (3NF): Ensures that no transitive dependencies exist (non-key attributes do not depend on other non-key attributes).

How do you handle time series data in Python?

Handling time series data in Python involves several steps:

Converting to DateTime format: df['date'] = pd.to_datetime(df['date'])

Resampling: To aggregate data at different frequencies:
df.set_index('date').resample('M').sum()

Decomposition: Split the time series into trend, seasonality, and residuals:

from statsmodels.tsa.seasonal import seasonal_decompose

decomposition = seasonal_decompose(df['value'], model='additive', period=12)
decomposition.plot()


Plotting: Use libraries like Matplotlib and Seaborn to visualize trends over time.
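A self-contained sketch of the conversion and resampling steps, using the month-start frequency 'MS' (the sample dates and values are invented):

```python
import pandas as pd

# Illustrative daily values spanning two months
df = pd.DataFrame({
    "date": ["2024-01-15", "2024-01-20", "2024-02-10"],
    "value": [10, 20, 30],
})
df["date"] = pd.to_datetime(df["date"])

# Aggregate to monthly totals
monthly = df.set_index("date").resample("MS").sum()
print(monthly["value"].tolist())  # [30, 30]
```

January's two rows sum to 30 and February's single row stays 30, one row per month.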


What are the advantages of using Power BI over Excel?

Data Handling: Power BI can handle much larger datasets (millions of rows) compared to Excel.

Data Modeling: Power BI allows creating complex data models and relationships between tables, which is harder to manage in Excel.

Interactive Visualizations: Power BI offers interactive dashboards with drill-down capabilities.

Advanced Features: Power BI supports advanced analytics, DAX for custom calculations, and integration with other tools like Azure and SharePoint.

Scheduled Refresh: Power BI allows automatic data refresh from connected sources, while in Excel, this needs to be done manually.

Like this post if you want me to continue the interview series 👍♥️

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
Recent Interview Question for Data Analyst Role

Question: You have two tables:

Employee:-
Columns: EID (Employee ID), ESalary (Employee Salary)

empdetails:-
Columns: EID (Employee ID), EDOB (Employee Date of Birth)

Your task is to:
1) Identify all employees whose salary (ESalary) is an odd number.
2) Retrieve the date of birth (EDOB) for these employees from the empdetails table.

How would you write a SQL query to achieve this?

SELECT e.EID, ed.EDOB
FROM (
SELECT EID
FROM Employee
WHERE ESalary % 2 <> 0
) e
JOIN empdetails ed ON e.EID = ed.EID;


Explanation of the query :-

Filter Employees with Odd Salaries:

The subquery SELECT EID FROM Employee WHERE ESalary % 2 <> 0 filters out Employee IDs (EID) where the salary (ESalary) is an odd number. The modulo operator % checks if ESalary divided by 2 leaves a remainder (<>0).

Merge with empdetails:

The main query then takes the filtered Employee IDs from the subquery and performs a join with the empdetails table using the EID column. This retrieves the date of birth (EDOB) for these employees.
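The query can be verified end-to-end with Python's sqlite3 module (the sample rows below are invented):

```python
import sqlite3

# Toy tables matching the schema above
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Employee (EID INTEGER, ESalary INTEGER);
CREATE TABLE empdetails (EID INTEGER, EDOB TEXT);
INSERT INTO Employee VALUES (1, 5000), (2, 5001), (3, 7777);
INSERT INTO empdetails VALUES (1, '1990-01-01'), (2, '1991-02-02'), (3, '1992-03-03');
""")

rows = conn.execute("""
SELECT e.EID, ed.EDOB
FROM (SELECT EID FROM Employee WHERE ESalary % 2 <> 0) e
JOIN empdetails ed ON e.EID = ed.EID
ORDER BY e.EID
""").fetchall()
print(rows)  # [(2, '1991-02-02'), (3, '1992-03-03')]
```

Only the employees with odd salaries (5001 and 7777) survive the subquery, and the join pulls their dates of birth.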

Hope this helps you 😊
🔍 Best Data Analytics Roles Based on Your Graduation Background!

Thinking about a career in Data Analytics but unsure which role fits your background? Check out these top job roles based on your degree:

🚀 For Mathematics/Statistics Graduates:
🔹 Data Analyst
🔹 Statistical Analyst
🔹 Quantitative Analyst
🔹 Risk Analyst

🚀 For Computer Science/IT Graduates:
🔹 Data Scientist
🔹 Business Intelligence Developer
🔹 Data Engineer
🔹 Data Architect

🚀 For Economics/Finance Graduates:
🔹 Financial Analyst
🔹 Market Research Analyst
🔹 Economic Consultant
🔹 Data Journalist

🚀 For Business/Management Graduates:
🔹 Business Analyst
🔹 Operations Research Analyst
🔹 Marketing Analytics Manager
🔹 Supply Chain Analyst

🚀 For Engineering Graduates:
🔹 Data Scientist
🔹 Industrial Engineer
🔹 Operations Research Analyst
🔹 Quality Engineer

🚀 For Social Science Graduates:
🔹 Data Analyst
🔹 Research Assistant
🔹 Social Media Analyst
🔹 Public Health Analyst

🚀 For Biology/Healthcare Graduates:
🔹 Clinical Data Analyst
🔹 Biostatistician
🔹 Research Coordinator
🔹 Healthcare Consultant

Pro Tip:

Some of these roles may require additional certifications or upskilling in SQL, Python, Power BI, Tableau, or Machine Learning to stand out in the job market.

Like if it helps ❤️
Becoming a Data Analyst in 2025 is more difficult than it was a couple of years ago. The competition has grown but so has the demand for Data Analysts!

There are 5 areas you need to excel at to land a career in data. (so punny...)
1. Skills
2. Experience
3. Networking
4. Job Search
5. Education

Let's dive into the first and most important area, skills.

Skills
Every data analytics job will require a different set of skills in its job description. To cover the majority of entry-level positions, you should focus on the core 3 (or 4 if you have time).
- Excel
- SQL
- Tableau or Power BI
- Python or R (optional)
No need to learn any more than this to get started. Start learning other skills AFTER you land your first job and see what data analytics path you really enjoy.
You might fall into a path that doesn't require Python at all and if you took 3 months to learn it, you wasted 3 months. Your goal should be to get your foot in the door.

Experience
So how do you show that you have experience if you have never worked as a Data Analyst professionally? 
It's actually easier than you think! 
There are a few ways you can gain experience: volunteer work, freelancing, or any analytics work at your current job.
First, ask your friends, family, or even Reddit if anyone needs help with their data.
Second, you can join Upwork or Fiverr to land freelance gigs for great experience and some extra money.
Third, even if your job title isn't "Data Analyst", you might analyze data anyway. Use this as experience!

Networking
I love this section the most. Everyone I have mentored has proven that this is one of the most important areas to work on.
Start talking to other data analysts, connect with the RIGHT people, post on LinkedIn, follow people in the field, and comment on posts.
All of this, over time, will keep getting "eyes" on your profile. This leads to more calls, interviews, and, like the people I teach, job offers.
Consistency is important here.

Job Search
I believe this is not a skill so much as a "numbers game", and the ones who excel here are the ones who are consistent.
I'm not saying you need to apply all day every day, but you should spend SOME time applying each day.
This matters because you don't know exactly when a company will post a new opening, and you want to be one of the first to apply. That means checking the job boards in several small chunks rather than in one long session.
The best way to do this is to open up all of the filters and select "most recent" and "posted within the last 3 days".

Education
If you have a degree or are currently on your way to getting one, this section doesn't really apply to you since you have a leg up on a lot more job opportunities.

So how else does someone show they are educated enough to become a Data Analyst?
You need to prove it by taking relevant courses in relation to the industry you want to enter. After the course, the actual certificate does not hold much weight unless it's an accredited certificate like a Tableau Professional Certificate. 

To counter this, you need to use your project descriptions to explain how you used data to solve a business problem, and explain it professionally.

There are so many other areas you could work on, but focusing on these to start will definitely get you going in the right direction.

Take time to put these actions to work. Pivot when something isn't working, and adapt.
It will take time, but these actions will shorten your path to becoming a Data Analyst in 2025.

Hope this helps you 😊
Data Analyst Interview Part-9

How do you perform joins in Power BI using relationships?

In Power BI, joins are handled through relationships between tables instead of traditional SQL joins. You can create relationships using the Model View, where you define one-to-one, one-to-many, or many-to-many relationships. Power BI automatically determines the best relationship based on column values, but you can modify the cardinality and cross-filter direction to control how data is connected across tables.


What are some common aggregate functions in Excel?

Aggregate functions summarize data in Excel. Common ones include:
SUM: Adds values in a range.
AVERAGE: Calculates the mean.
COUNT: Counts the number of non-empty cells.
MAX/MIN: Finds the highest and lowest values.
MEDIAN: Returns the middle value of a dataset.
STDEV: Measures data variation (Standard Deviation).
These functions are commonly used in financial analysis, data validation, and reporting.


What are DAX functions in Power BI, and why are they important?

DAX (Data Analysis Expressions) functions help create custom calculations and measures in Power BI. They are important because they allow users to perform dynamic aggregations, conditional calculations, and time-based analysis. Key categories include:
Aggregation Functions: SUM, AVERAGE, COUNT
Filter Functions: FILTER, CALCULATE
Time Intelligence Functions: DATEADD, SAMEPERIODLASTYEAR
Logical Functions: IF, SWITCH
DAX enables advanced reporting and helps build meaningful insights from raw data.


What is data normalization, and why is it important?

Data normalization is the process of organizing data in a database to reduce redundancy and improve data integrity. It ensures efficient storage and retrieval by dividing large tables into smaller, related tables and using foreign keys to maintain relationships.

Benefits of normalization include:

Eliminates duplicate data
Improves consistency and accuracy
Enhances database performance
Reduces data anomalies

Normalization is crucial in relational databases to maintain a clean and scalable data structure.


What are some common data visualization best practices?

Effective data visualization helps communicate insights clearly. Best practices include:

Choose the right chart (e.g., bar charts for comparisons, line charts for trends).
Keep it simple (avoid unnecessary elements like 3D effects).
Use colors wisely (highlight key insights without overloading with colors).
Ensure data accuracy (labels, scales, and values must be correct).
Use interactive elements (filters, drill-downs in Power BI/Tableau).
Provide context (titles, legends, and annotations to explain findings).

Well-designed visualizations improve decision-making and help stakeholders understand data easily.

Like this post if you want me to continue the interview series 👍♥️

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
The key to starting your data analysis career:

It's not your education
It's not your experience

It's how you apply these principles:

1. Learn the job through "doing"
2. Build a portfolio
3. Make yourself known

No one starts as an expert, but everyone can become one.

If you're looking for a career in data analysis, start by:

⟶ Watching videos
⟶ Reading expert advice
⟶ Doing internships
⟶ Building a portfolio
⟶ Learning from seniors

You'll be amazed at how fast you'll learn and how quickly you'll become an expert.

So, start today and let the data analysis career begin

Hope it helps :)
What separates a good 𝗗𝗮𝘁𝗮 𝗔𝗻𝗮𝗹𝘆𝘀𝘁 from a great one?

The journey to becoming an exceptional data analyst requires mastering a blend of technical and soft skills.

Technical skills:
- Querying Data with SQL
- Data Visualization (Tableau/PowerBI)
- Data Storytelling and Reporting
- Data Exploration and Analytics
- Data Modeling

Soft Skills:
- Problem Solving
- Communication
- Business Acumen
- Curiosity
- Critical Thinking
- Learning Mindset

But how do you develop these soft skills?

◆ Tackle real-world data projects or case studies. The more complex, the better.

◆ Practice explaining your analysis to non-technical audiences. If they understand, you’ve nailed it!

◆ Learn how industries use data for decision-making. Align your analysis with business outcomes.

◆ Stay curious, ask 'why,' and dig deeper into your data. Don’t settle for surface-level insights.

◆ Keep evolving. Attend webinars, read books, or engage with industry experts regularly.
🚀 How to Land a Data Analyst Job Without Experience?

Many people have asked me this question, so I'm answering it here to help everyone. Here is the step-by-step approach I would recommend:

Step 1: Master the Essential Skills

You need to build a strong foundation in:

🔹 SQL – Learn how to extract and manipulate data
🔹 Excel – Master formulas, Pivot Tables, and dashboards
🔹 Python – Focus on Pandas, NumPy, and Matplotlib for data analysis
🔹 Power BI/Tableau – Learn to create interactive dashboards
🔹 Statistics & Business Acumen – Understand data trends and insights

Where to learn?
📌 Google Data Analytics Course
📌 SQL – Mode Analytics (Free)
📌 Python – Kaggle or DataCamp


Step 2: Work on Real-World Projects

Employers care more about what you can do than about your degree. Build 3-4 projects to showcase your skills.

🔹 Project Ideas:

Analyze sales data to find profitable products
Clean messy datasets using SQL or Python
Build an interactive Power BI dashboard
Predict customer churn using machine learning (optional)

Use Kaggle, Data.gov, or Google Dataset Search to find free datasets!
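The first project idea above can start very small. Here's a sketch in plain Python (the sales records are invented) that totals profit per product and ranks the winners:

```python
from collections import defaultdict

# Hypothetical sales records: revenue and cost per transaction
sales = [
    {"product": "Laptop", "revenue": 900, "cost": 700},
    {"product": "Mouse",  "revenue": 25,  "cost": 10},
    {"product": "Laptop", "revenue": 950, "cost": 720},
    {"product": "Mouse",  "revenue": 30,  "cost": 12},
    {"product": "Desk",   "revenue": 200, "cost": 180},
]

# Accumulate profit (revenue - cost) per product
profit = defaultdict(float)
for row in sales:
    profit[row["product"]] += row["revenue"] - row["cost"]

# Rank products by total profit, highest first
for product, p in sorted(profit.items(), key=lambda kv: kv[1], reverse=True):
    print(product, p)  # Laptop 430.0, then Mouse 33.0, then Desk 20.0
```

The same GROUP BY + SUM logic translates directly to SQL or a Power BI measure once you have a real dataset from Kaggle or Data.gov.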


Step 3: Build an Impressive Portfolio

Once you have projects, showcase them! Create:
📌 A GitHub repository to store your SQL/Python code
📌 A Tableau or Power BI Public Profile for dashboards
📌 A Medium or LinkedIn post explaining your projects

A strong portfolio = More job opportunities! 💡


Step 4: Get Hands-On Experience

If you don’t have experience, create your own!
📌 Do freelance projects on Upwork/Fiverr
📌 Join an internship or volunteer for NGOs
📌 Participate in Kaggle competitions
📌 Contribute to open-source projects

Real-world practice > Theoretical knowledge!


Step 5: Optimize Your Resume & LinkedIn Profile

Your resume should highlight:
✔️ Skills (SQL, Python, Power BI, etc.)
✔️ Projects (Brief descriptions with links)
✔️ Certifications (Google Data Analytics, Coursera, etc.)

Bonus Tip:
🔹 Write "Data Analyst in Training" on LinkedIn
🔹 Start posting insights from your learning journey
🔹 Engage with recruiters & join LinkedIn groups


Step 6: Start Applying for Jobs

Don’t wait for the perfect job—start applying!
📌 Apply on LinkedIn, Indeed, and company websites
📌 Network with professionals in the industry
📌 Be ready for SQL & Excel assessments

Pro Tip: Even if you don’t meet 100% of the job requirements, apply anyway! Many companies are open to hiring self-taught analysts.

You don’t need a fancy degree to become a Data Analyst. Skills + Projects + Networking = Your job offer!

🔥 Your Challenge: Start your first project today and track your progress!

Share with credits: https://news.1rj.ru/str/sqlspecialist

Hope it helps :)
𝟓 𝐖𝐚𝐲𝐬 𝐭𝐨 𝐀𝐩𝐩𝐥𝐲 𝐟𝐨𝐫 𝐃𝐚𝐭𝐚 𝐀𝐧𝐚𝐥𝐲𝐬𝐭 𝐉𝐨𝐛𝐬

🔸𝐔𝐬𝐞 𝐉𝐨𝐛 𝐏𝐨𝐫𝐭𝐚𝐥𝐬
Job boards like LinkedIn & Naukri are great portals to find jobs.

Set up job alerts using keywords like “Data Analyst” so you’ll get notified as soon as something new comes up.

🔸𝐓𝐚𝐢𝐥𝐨𝐫 𝐘𝐨𝐮𝐫 𝐑𝐞𝐬𝐮𝐦𝐞
Don’t send the same resume to every job.

Take time to highlight the skills and tools that the job description asks for, like SQL, Power BI, or Excel. It helps your resume get noticed by software that scans for keywords (ATS).

🔸𝐔𝐬𝐞 𝐋𝐢𝐧𝐤𝐞𝐝𝐈𝐧
Connect with recruiters and employees from your target companies. Ask for referrals when a job opening is posted.

Engage with data-related content and share your own work (like project insights or dashboards).

🔸𝐂𝐡𝐞𝐜𝐤 𝐂𝐨𝐦𝐩𝐚𝐧𝐲 𝐖𝐞𝐛𝐬𝐢𝐭𝐞𝐬 𝐑𝐞𝐠𝐮𝐥𝐚𝐫𝐥𝐲
Most big companies post jobs directly on their websites first.

Create a list of companies you’re interested in and keep checking their careers page. It’s a good way to find openings early before they post on job portals.

🔸𝐅𝐨𝐥𝐥𝐨𝐰 𝐔𝐩 𝐀𝐟𝐭𝐞𝐫 𝐀𝐩𝐩𝐥𝐲𝐢𝐧𝐠
After applying to a job, it helps to follow up with a quick message on LinkedIn. Send a polite note to the recruiter and ask for an update on your candidature.
I see so many people jump into data analytics, excited by its popularity, only to feel lost or uninterested soon after. I get it, data isn’t for everyone, and that’s okay.

Data analytics requires a certain spark or say curiosity. You need that drive to dig deeper, to understand why things happen, to explore how data pieces connect to reveal a bigger picture. Without that spark, it’s easy to feel overwhelmed or even bored.

Before diving in, ask yourself, Do I really enjoy solving puzzles? Am I genuinely excited about numbers, patterns, and insights? If you’re curious and love learning, data can be incredibly rewarding. But if it’s just about following a trend, it might not be a fulfilling path for you.

Be honest with yourself. Find your passion, whether it’s in data or somewhere else and invest in something that truly excites you.

Hope this helps you 😊
As a data analytics enthusiast, the end goal is not just to learn SQL, Power BI, Python, Excel, etc. but to get a job as a Data Analyst👨💻

Back then, when I was trying to switch my career into data analytics, I used to set aside 1-1.5 hours of my day so that I could use that time to search for job openings in data analytics and business intelligence.

Before going to bed, I used the first 30 minutes to go through job portals such as Naukri and LinkedIn to find relevant openings, and the next hour to collect keywords from the job description, tailor my resume accordingly, and search for profiles of people who could refer me for the role.

📍 I would advise every aspiring data analyst to set aside dedicated time for searching and applying for jobs.

📍To get into data analytics, applying for jobs is as important as learning and upskilling.

If you are not applying for jobs, you are simply delaying your entry into data analytics👨💻📊

Data Analytics Resources
👇👇
https://news.1rj.ru/str/DataSimplifier

Hope this helps you 😊