Resolve user conflicts

feat/new-user
Kamran Ahmed 4 days ago
commit 426440c0f5
  1. .astro/settings.json (2 changes)
  2. .env.example (3 changes)
  3. license (3 changes)
  4. package.json (3 changes)
  5. pnpm-lock.yaml (2690 changes)
  6. public/roadmap-content/ai-engineer.json (2 changes)
  7. public/roadmap-content/computer-science.json (19 changes)
  8. public/roadmap-content/data-analyst.json (215 changes)
  9. public/roadmap-content/frontend.json (15 changes)
  10. public/roadmap-content/php.json (10 changes)
  11. public/roadmap-content/postgresql-dba.json (6 changes)
  12. public/roadmap-content/product-manager.json (15 changes)
  13. public/roadmap-content/python.json (27 changes)
  14. public/roadmap-content/software-architect.json (2 changes)
  15. public/roadmap-content/system-design.json (6 changes)
  16. public/roadmap-content/ux-design.json (10 changes)
  17. public/roadmap-content/vue.json (28 changes)
  18. src/components/AdvertiseForm.tsx (3 changes)
  19. src/components/AstroIcon.astro (2 changes)
  20. src/components/AuthenticationFlow/CourseLoginPopup.tsx (143 changes)
  21. src/components/AuthenticationFlow/GitHubButton.tsx (39 changes)
  22. src/components/AuthenticationFlow/GoogleButton.tsx (48 changes)
  23. src/components/AuthenticationFlow/LinkedInButton.tsx (50 changes)
  24. src/components/CreateTeam/RoadmapSelector.tsx (38 changes)
  25. src/components/Navigation/Navigation.astro (8 changes)
  26. src/components/ReactIcons/RoadmapLogo.tsx (28 changes)
  27. src/components/RoadmapDropdownMenu/RoadmapDropdownMenu.tsx (2 changes)
  28. src/components/SQLCourse/AccountButton.tsx (67 changes)
  29. src/components/SQLCourse/BuyButton.tsx (213 changes)
  30. src/components/SQLCourse/ChapterRow.tsx (145 changes)
  31. src/components/SQLCourse/CourseAuthor.tsx (24 changes)
  32. src/components/SQLCourse/CourseFeature.tsx (94 changes)
  33. src/components/SQLCourse/FAQSection.tsx (113 changes)
  34. src/components/SQLCourse/FloatingPurchase.tsx (56 changes)
  35. src/components/SQLCourse/SQLCoursePage.tsx (415 changes)
  36. src/components/SQLCourse/SectionHeader.tsx (29 changes)
  37. src/components/SQLCourse/Spotlight.tsx (57 changes)
  38. src/components/TopicDetail/TopicProgressButton.tsx (1 change)
  39. src/components/UserProgress/UserProgressModalHeader.tsx (4 changes)
  40. src/data/guides/basics-of-authentication.md (2 changes)
  41. src/data/guides/devops-job-description.md (235 changes)
  42. src/data/guides/devops-principles.md (2 changes)
  43. src/data/guides/devops-shift-left-testing.md (2 changes)
  44. src/data/guides/devops-test-automation.md (10 changes)
  45. src/data/guides/devops-vs-agile.md (204 changes)
  46. src/data/guides/devops-vs-devsecops.md (231 changes)
  47. src/data/projects/weather-api-wrapper-service.md (2 changes)
  48. src/data/roadmaps/ai-engineer/ai-engineer.json (2 changes)
  49. src/data/roadmaps/ai-engineer/content/bias-and-fairness@lhIU0ulpvDAn1Xc3ooYz_.md (2 changes)
  50. src/data/roadmaps/backend/backend-beginner.json (2 changes)
  51. src/data/roadmaps/computer-science/content/dml@tcQSH-eAvJUZuePTDjAIb.md (4 changes)
  52. src/data/roadmaps/computer-science/content/how-computers-calculate@GDLKJkKgB-i7n0YcV2NDa.md (2 changes)
  53. src/data/roadmaps/computer-science/content/p--np@0btHNkzWL1w_-pUgU_k2y.md (8 changes)
  54. src/data/roadmaps/data-analyst/content/analaysis--reporting-with-excel@sgXIjVTbwdwdYoaxN3XBM.md (2 changes)
  55. src/data/roadmaps/data-analyst/content/apis@4DFcXSSHxg5wv0uXLIRij.md (2 changes)
  56. src/data/roadmaps/data-analyst/content/average@yn1sstYMO9du3rpfQqNs9.md (4 changes)
  57. src/data/roadmaps/data-analyst/content/bar-charts@EVk1H-QLtTlpG7lVEenDt.md (4 changes)
  58. src/data/roadmaps/data-analyst/content/big-data-technologies@_aUQZWUhFRvNu0MZ8CPit.md (6 changes)
  59. src/data/roadmaps/data-analyst/content/cleanup@nC7tViln4UyQFYP_-fyjB.md (2 changes)
  60. src/data/roadmaps/data-analyst/content/data-cleanup@E6cpb6kvluJM8OGuDcFBT.md (6 changes)
  61. src/data/roadmaps/data-analyst/content/data-collection@_sjXCLHHTbZromJYn6fnu.md (6 changes)
  62. src/data/roadmaps/data-analyst/content/data-manipulation-libraries@M1QtGTLyygIjePoCfvjve.md (8 changes)
  63. src/data/roadmaps/data-analyst/content/data-storage-solutions@iTmtpXe7dR4XKslgpsk2q.md (4 changes)
  64. src/data/roadmaps/data-analyst/content/data-transformation@t_BRtEharsrOZxoyX0OzV.md (2 changes)
  65. src/data/roadmaps/data-analyst/content/data-visualisation-libraries@l1SnPc4EMqGdaIAhIQfrT.md (6 changes)
  66. src/data/roadmaps/data-analyst/content/data-visualisation@2g19zjEASJw2ve57hxpr0.md (6 changes)
  67. src/data/roadmaps/data-analyst/content/databases@tYPeLCxbqvMFlTkCGjdHg.md (4 changes)
  68. src/data/roadmaps/data-analyst/content/datedif@yBlJrNo9eO470dLp6OaQZ.md (2 changes)
  69. src/data/roadmaps/data-analyst/content/deep-learning-optional@SiYUdtYMDImRPmV2_XPkH.md (6 changes)
  70. src/data/roadmaps/data-analyst/content/dplyr@v8TfY-b4W5ygOv7r-syHq.md (2 changes)
  71. src/data/roadmaps/data-analyst/content/dplyr@y__UHXe2DD-IB7bvMF1-X.md (2 changes)
  72. src/data/roadmaps/data-analyst/content/ggplot2@E0hIgQEeZlEidr4HtUFrL.md (4 changes)
  73. src/data/roadmaps/data-analyst/content/hadoop@wECWIRMlWNoTxz5eKwaSf.md (4 changes)
  74. src/data/roadmaps/data-analyst/content/heatmap@G8resQXEVEHCaQfDlt3nj.md (4 changes)
  75. src/data/roadmaps/data-analyst/content/image-recognition@bHPJ6yOHtUq5EjJBSrJUE.md (2 changes)
  76. src/data/roadmaps/data-analyst/content/matplotlib@tvDdXwaRPsUSTqJGaLS3P.md (6 changes)
  77. src/data/roadmaps/data-analyst/content/matplotlib@uGkXxdMXUMY-3fQFS1jK8.md (4 changes)
  78. src/data/roadmaps/data-analyst/content/mode@fY8zVG2tVbmtx5OhY7hj-.md (2 changes)
  79. src/data/roadmaps/data-analyst/content/model-evaluation-techniques@7ikA373qH88HBx5irCgIH.md (4 changes)
  80. src/data/roadmaps/data-analyst/content/neural-networks@gGHsKcS92StK5FolzmVvm.md (2 changes)
  81. src/data/roadmaps/data-analyst/content/pandas@8OXmF2Gn6TYJotBRvDjqA.md (2 changes)
  82. src/data/roadmaps/data-analyst/content/pandas@TucngXKNptbeo3PtdJHX8.md (2 changes)
  83. src/data/roadmaps/data-analyst/content/pie-charts@K9xwm_Vpdup9ujYqlD9F3.md (6 changes)
  84. src/data/roadmaps/data-analyst/content/pivot-tables@2DDJUFr0AJTVR2Whj8zub.md (6 changes)
  85. src/data/roadmaps/data-analyst/content/power-bi@SJLeose5vZU8w_18C8_t0.md (2 changes)
  86. src/data/roadmaps/data-analyst/content/predictive-analytics@3WZORRCwme3HsaKew23Z5.md (4 changes)
  87. src/data/roadmaps/data-analyst/content/pytorch@LJSqfz6aYJbCe_bK8EWI1.md (3 changes)
  88. src/data/roadmaps/data-analyst/content/range@tSxtyJhL5wjU0XJcjsJmm.md (2 changes)
  89. src/data/roadmaps/data-analyst/content/rnn@Gocm98_tRg5BGxKcP-7zg.md (4 changes)
  90. src/data/roadmaps/data-analyst/content/scatter-plot@A5YQv7D4qRcskdZ64XldH.md (2 changes)
  91. src/data/roadmaps/data-analyst/content/seaborn@-cJb8gEBvdVFf7FlgG3Ud.md (2 changes)
  92. src/data/roadmaps/data-analyst/content/spark@vaiigToDh4522rtWamuSM.md (2 changes)
  93. src/data/roadmaps/data-analyst/content/stacked-charts@329BrtmXjXNLfi1SFfdeo.md (2 changes)
  94. src/data/roadmaps/data-analyst/content/statistical-analysis@TeewVruErSsD4VLXcaDxp.md (5 changes)
  95. src/data/roadmaps/data-analyst/content/supervised-learning@FIYCkGXofKMsXmsqHSMh9.md (2 changes)
  96. src/data/roadmaps/data-analyst/content/tableau@Sz2Y8HLbSmDjSKAJztDql.md (2 changes)
  97. src/data/roadmaps/data-analyst/content/tensorflow@FJ4Sx477FWxyDsQr0R8rl.md (3 changes)
  98. src/data/roadmaps/data-analyst/content/types-of-data-analytics@Lsapbmg-eMIYJAHpV97nO.md (9 changes)
  99. src/data/roadmaps/data-analyst/content/unsupervised-learning@FntL9E2yVAYwIrlANDNKE.md (4 changes)
  100. src/data/roadmaps/data-analyst/content/variance@ict4JkoVM-AzPbp9bDztg.md (4 changes)
Some files were not shown because too many files have changed in this diff.

@@ -3,6 +3,6 @@
 "enabled": false
 },
 "_variables": {
-"lastUpdateCheck": 1737069970237
+"lastUpdateCheck": 1737392387456
 }
 }
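The `lastUpdateCheck` values in the hunk above are Unix timestamps in milliseconds. A quick sketch (plain Python, not part of the project) to read them as dates:

```python
from datetime import datetime, timezone

def ms_to_utc(ms: int) -> datetime:
    # The settings store epoch milliseconds; convert to an aware UTC datetime.
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

old = ms_to_utc(1737069970237)
new = ms_to_utc(1737392387456)
print(old.date(), "->", new.date())  # 2025-01-16 -> 2025-01-20
```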

@@ -1,3 +1,4 @@
 PUBLIC_API_URL=https://api.roadmap.sh
 PUBLIC_AVATAR_BASE_URL=https://dodrc8eu8m09s.cloudfront.net/avatars
 PUBLIC_EDITOR_APP_URL=https://draw.roadmap.sh
+PUBLIC_COURSE_APP_URL=http://localhost:5173
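The hunk above adds a `PUBLIC_COURSE_APP_URL` variable that defaults to a local dev server. As a hedged sketch of how such a variable is typically consumed (the project itself is an Astro/TypeScript app; this plain-Python snippet is purely illustrative, only the variable name and default come from the diff):

```python
import os

# Fall back to the default from .env.example when the variable is unset.
# The consuming code here is hypothetical, not part of the repository.
course_app_url = os.environ.get("PUBLIC_COURSE_APP_URL", "http://localhost:5173")
print(course_app_url)
```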

@@ -1,6 +1,7 @@
 Everything including text and images in this project are protected by the copyright laws.
 You are allowed to use this material for personal use but are not allowed to use it for
-any other purpose including publishing the images, the project files or the content in the images in any form either digital, non-digital, textual, graphical or written formats.
+any other purpose including publishing the images, the project files or the content in
+the images in any form either digital, non-digital, textual, graphical or written formats.
 You are allowed to share the links to the repository or the website roadmap.sh but not
 the content for any sort of usage that involves the content of this repository taken out
 of the repository and be shared from any other medium including but not limited to blog

@@ -67,10 +67,12 @@
 "rehype-external-links": "^3.0.0",
 "remark-parse": "^11.0.0",
 "roadmap-renderer": "^1.0.6",
+"sanitize-html": "^2.13.1",
 "satori": "^0.11.2",
 "satori-html": "^0.3.2",
 "sharp": "^0.33.5",
 "slugify": "^1.6.6",
+"tiptap-markdown": "^0.8.10",
 "tailwind-merge": "^2.5.3",
 "tailwindcss": "^3.4.13",
 "turndown": "^7.2.0",
@@ -86,6 +88,7 @@
 "@types/prismjs": "^1.26.4",
 "@types/react-calendar-heatmap": "^1.6.7",
 "@types/react-slick": "^0.23.13",
+"@types/sanitize-html": "^2.13.0",
 "@types/turndown": "^5.0.5",
 "csv-parser": "^3.0.0",
 "gh-pages": "^6.2.0",

File diff suppressed because it is too large.

@@ -633,7 +633,7 @@
 ]
 },
 "lhIU0ulpvDAn1Xc3ooYz_": {
-"title": "Bias and Fareness",
+"title": "Bias and Fairness",
 "description": "Bias and fairness in AI refer to the challenges of ensuring that machine learning models do not produce discriminatory or skewed outcomes. Bias can arise from imbalanced training data, flawed assumptions, or biased algorithms, leading to unfair treatment of certain groups based on race, gender, or other factors. Fairness aims to address these issues by developing techniques to detect, mitigate, and prevent biases in AI systems. Ensuring fairness involves improving data diversity, applying fairness constraints during model training, and continuously monitoring models in production to avoid unintended consequences, promoting ethical and equitable AI use.\n\nLearn more from the following resources:",
 "links": [
 {
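The fairness description above mentions detecting bias in model outcomes. One common check is demographic parity: compare selection (approval) rates across groups. A minimal illustration in plain Python — the metric choice and the toy data are ours, not taken from the roadmap content:

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, approved) pairs, approved in {0, 1}."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def demographic_parity_gap(decisions):
    # A gap near 0 means similar approval rates across groups.
    rates = selection_rates(decisions).values()
    return max(rates) - min(rates)

data = [("A", 1), ("A", 1), ("A", 0), ("A", 1),
        ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
print(demographic_parity_gap(data))  # 0.75 - 0.25 = 0.5
```

In practice this kind of check runs continuously against production predictions, which is the monitoring step the description refers to.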

@@ -2668,7 +2668,7 @@
 },
 "0btHNkzWL1w_-pUgU_k2y": {
 "title": "P = NP",
-"description": "The P = NP problem is one of the most famous problems in computer science. It asks if the problem of determining if a given input belongs to a certain class of problems is as hard as the problem of solving the given input. In other words, it asks if the problem of determining if a given input belongs to a certain class of problems is as hard as the problem of determining if a given input belongs to a certain class of problems. This problem is also known as the Halting Problem.\n\nVisit the following resources to learn more:",
+"description": "The P = NP problem is one of the most famous problems in computer science. It asks whether a problem that can be solved in polynomial time on a non-deterministic machine (i.e., the problem is in NP) can also be solved in polynomial time on a deterministic machine (i.e., the problem is in P).\n\nIf you can find a polynomial-time solution to an NP-complete problem, then all problems in NP can be solved in polynomial time. This shows that P = NP.\n\nIf you can prove for any single NP-complete problem that it is only solvable in exponential time, then all NP-complete problems are only solvable in exponential time. This shows that P ≠ NP.\n\nSo far, we don't know whether P = NP or P ≠ NP.\n\nVisit the following resources to learn more:",
 "links": [
 {
 "title": "Whats P=NP?, and why is it such a famous question?",
@@ -3366,8 +3366,19 @@
 },
 "tcQSH-eAvJUZuePTDjAIb": {
 "title": "DML",
-"description": "The SQL commands that deals with the manipulation of data present in the database belong to DML or Data Manipulation Language and this includes most of the SQL statements. It is the component of the SQL statement that controls access to data and to the database. Basically, DCL statements are grouped with DML statements.\n\nVisit the following resources to learn more:",
-"links": []
+"description": "The SQL commands that manipulate data in the database belong to DML, or Data Manipulation Language, and this includes most of the SQL statements. DCL is the component of the SQL statement that controls access to data and to the database. Basically, DCL statements are grouped with DML statements.\n\nVisit the following resources to learn more:",
+"links": [
+{
+"title": "DML: Data Manipulation Language",
+"url": "https://satoricyber.com/glossary/dml-data-manipulation-language",
+"type": "article"
+},
+{
+"title": "Difference Between DDL and DML",
+"url": "https://appmaster.io/blog/difference-between-ddl-and-dml",
+"type": "article"
+}
+]
 },
 "05lkb3B86Won7Rkf-8DeD": {
 "title": "DQL",
@@ -3931,7 +3942,7 @@
 },
 "GDLKJkKgB-i7n0YcV2NDa": {
 "title": "How Computers Calculate",
-"description": "Visit the following resources to learn more:",
+"description": "Computers calculate using the binary system, where all data is represented as 0s and 1s. These binary states correspond to the ON/OFF positions of transistors, which are the building blocks of logic gates (AND, OR, NOT). Numbers, characters, and instructions are broken into binary sequences (bits), and grouped into bytes (8 bits). Arithmetic operations like addition are performed through logic gates, which combine binary values. The CPU executes these calculations by following a fetch-decode-execute cycle. Complex calculations, such as handling decimals, use floating-point representation. Programs written in high-level languages are compiled into machine code for the CPU to execute.\n\nVisit the following resources to learn more:",
 "links": [
 {
 "title": "How computers calculate - ALU",

@ -6,12 +6,18 @@
},
"yCnn-NfSxIybUQ2iTuUGq": {
"title": "What is Data Analytics",
"description": "Data Analytics is a core component of a Data Analyst's role. The field involves extracting meaningful insights from raw data to drive decision-making processes. It includes a wide range of techniques and disciplines ranging from the simple data compilation to advanced algorithms and statistical analysis. As a data analyst, you are expected to understand and interpret complex digital data, such as the usage statistics of a website, the sales figures of a company, or client engagement over social media, etc. This knowledge enables data analysts to support businesses in identifying trends, making informed decisions, predicting potential outcomes - hence playing a crucial role in shaping business strategies.",
"links": []
"description": "Data Analytics is a core component of a Data Analyst's role. The field involves extracting meaningful insights from raw data to drive decision-making processes. It includes a wide range of techniques and disciplines ranging from the simple data compilation to advanced algorithms and statistical analysis. As a data analyst, you are expected to understand and interpret complex digital data, such as the usage statistics of a website, the sales figures of a company, or client engagement over social media, etc. This knowledge enables data analysts to support businesses in identifying trends, making informed decisions, predicting potential outcomes - hence playing a crucial role in shaping business strategies.\n\nLearn more from the following resources:",
"links": [
{
"title": "Introduction to Data Analytics",
"url": "https://www.coursera.org/learn/introduction-to-data-analytics",
"type": "article"
}
]
},
"Lsapbmg-eMIYJAHpV97nO": {
"title": "Types of Data Analytics",
"description": "Data Analytics has proven to be a critical part of decision-making in modern business ventures. It is responsible for discovering, interpreting, and transforming data into valuable information. Different types of data analytics look at past, present, or predictive views of business operations.\n\nData Analysts, as ambassadors of this domain, employ these types, to answer various questions:\n\n* Descriptive Analytics _(what happened in the past?)_\n* Diagnostic Analytics _(why did it happened in the past?)_\n* Predictive Analytics _(what will happen in the future?)_\n* Prescriptive Analytics _(how can we make it happen?)_\n\nUnderstanding these types gives data analysts the power to transform raw datasets into strategic insights.\n\nVisit the following resources to learn more:",
"description": "Data Analytics has proven to be a critical part of decision-making in modern business ventures. It is responsible for discovering, interpreting, and transforming data into valuable information. Different types of data analytics look at past, present, or predictive views of business operations.\n\nData Analysts, as ambassadors of this domain, employ these types, to answer various questions:\n\n* Descriptive Analytics _(what happened in the past?)_\n* Diagnostic Analytics _(why did it happened in the past?)_\n* Predictive Analytics _(what will happen in the future?)_\n* Prescriptive Analytics _(how can we make it happen?)_\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Data Analytics and its type",
@ -72,12 +78,12 @@
"description": "Predictive analysis is a crucial type of data analytics that any competent data analyst should comprehend. It refers to the practice of extracting information from existing data sets in order to determine patterns and forecast future outcomes and trends. Data analysts apply statistical algorithms, machine learning techniques, and artificial intelligence to the data to anticipate future results. Predictive analysis enables organizations to be proactive, forward-thinking, and strategic by providing them valuable insights on future occurrences. It's a powerful tool that gives companies a significant competitive edge by enabling risk management, opportunity identification, and strategic decision-making.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is predictive analytics? - Google",
"title": "What is Predictive Analytics? - Google",
"url": "https://cloud.google.com/learn/what-is-predictive-analytics",
"type": "article"
},
{
"title": "What is predictive analytics?",
"title": "What is Predictive Analytics?",
"url": "https://www.youtube.com/watch?v=cVibCHRSxB0",
"type": "video"
}
@ -125,7 +131,7 @@
"description": "The Cleanup of Data is a critical component of a Data Analyst's role. It involves the process of inspecting, cleaning, transforming, and modeling data to discover useful information, inform conclusions, and support decision making. This process is crucial for Data Analysts to generate accurate and significant insights from data, ultimately resulting in better and more informed business decisions. A solid understanding of data cleanup procedures and techniques is a fundamental skill for any Data Analyst. Hence, it is necessary to hold a high emphasis on maintaining data quality by managing data integrity, accuracy, and consistency during the data cleanup process.\n\nLearn more from the following resources:",
"links": [
{
"title": "Top 10 ways to clean your data",
"title": "Top 10 Ways to Clean Your Data",
"url": "https://support.microsoft.com/en-gb/office/top-ten-ways-to-clean-your-data-2844b620-677c-47a7-ac3e-c2e157d1db19",
"type": "article"
},
@ -157,7 +163,7 @@
"description": "The visualization of data is an essential skill in the toolkit of every data analyst. This practice is about transforming complex raw data into a graphical format that allows for an easier understanding of large data sets, trends, outliers, and important patterns. Whether pie charts, line graphs, bar graphs, or heat maps, data visualization techniques not only streamline data analysis, but also facilitate a more effective communication of the findings to others. This key concept underscores the importance of presenting data in a digestible and visually appealing manner to drive data-informed decision making in an organization.\n\nLearn more from the following resources:",
"links": [
{
"title": "Data visualization beginner's guide",
"title": "Data Visualization Beginner's Guide",
"url": "https://www.tableau.com/en-gb/learn/articles/data-visualization",
"type": "article"
},
@ -223,7 +229,7 @@
},
"yBlJrNo9eO470dLp6OaQZ": {
"title": "DATEDIF",
"description": "The `DATEDIF` function is an incredibly valuable tool for a Data Analyst in Excel or Google Sheets, by providing the ability to calculate the difference between two dates. This function takes in three parameters: start date, end date and the type of difference required (measured in years, months, days, etc.). In Data Analysis, particularly when dealing with time-series data or when you need to uncover trends over specific periods, the `DATEDIF` function is a necessary asset. Recognizing its functionality will enable a data analyst to manipulate or shape data progressively and efficiently.\n\n* `DATEDIF` is technically still supported, but wont show as an option. For additional information, see Excel \"Help\" page.\n\nLearn more from the following resources:",
"description": "The `DATEDIF` function is an incredibly valuable tool for a Data Analyst in Excel or Google Sheets, by providing the ability to calculate the difference between two dates. This function takes in three parameters: start date, end date and the type of difference required (measured in years, months, days, etc.). In Data Analysis, particularly when dealing with time-series data or when you need to uncover trends over specific periods, the `DATEDIF` function is a necessary asset. Recognizing its functionality will enable a data analyst to manipulate or shape data progressively and efficiently.\n\n`DATEDIF` is technically still supported, but wont show as an option. For additional information, see Excel \"Help\" page.\n\nLearn more from the following resources:",
"links": [
{
"title": "DATEDIF function",
@ -407,17 +413,17 @@
"description": "Data Analysts recurrently find the need to summarize, investigate, and analyze their data to make meaningful and insightful decisions. One of the most powerful tools to accomplish this in Microsoft Excel is the Pivot Table. Pivot Tables allow analysts to organize and summarize large quantities of data in a concise, tabular format. The strength of pivot tables comes from their ability to manipulate data dynamically, leading to quicker analysis and richer insights. Understanding and employing Pivot Tables efficiently is a fundamental skill for any data analyst, as it directly impacts their ability to derive significant information from raw datasets.\n\nLearn more from the following resources:",
"links": [
{
"title": "Create a pivot table",
"title": "Create a Pivot Table",
"url": "https://support.microsoft.com/en-gb/office/create-a-pivottable-to-analyze-worksheet-data-a9a84538-bfe9-40a9-a8e9-f99134456576",
"type": "article"
},
{
"title": "Pivot tables in excel",
"title": "Pivot Tables in Excel",
"url": "https://www.excel-easy.com/data-analysis/pivot-tables.html",
"type": "article"
},
{
"title": "How to create a pivot table in excel",
"title": "How to Create a Pivot Table in Excel",
"url": "https://www.youtube.com/watch?v=PdJzy956wo4",
"type": "video"
}
@ -452,15 +458,31 @@
},
"M1QtGTLyygIjePoCfvjve": {
"title": "Data Manipulation Libraries",
"description": "Data manipulation libraries are essential tools in data science and analytics, enabling efficient handling, transformation, and analysis of large datasets. Python, a popular language for data science, offers several powerful libraries for this purpose. Pandas is a highly versatile library that provides data structures like DataFrames, which allow for easy manipulation and analysis of tabular data. NumPy, another fundamental library, offers support for large, multi-dimensional arrays and matrices, along with a collection of mathematical functions to operate on these arrays. Together, Pandas and NumPy form the backbone of data manipulation in Python, facilitating tasks such as data cleaning, merging, reshaping, and statistical analysis, thus streamlining the data preparation process for machine learning and other data-driven applications.",
"links": []
"description": "Data manipulation libraries are essential tools in data science and analytics, enabling efficient handling, transformation, and analysis of large datasets. Python, a popular language for data science, offers several powerful libraries for this purpose. Pandas is a highly versatile library that provides data structures like DataFrames, which allow for easy manipulation and analysis of tabular data. NumPy, another fundamental library, offers support for large, multi-dimensional arrays and matrices, along with a collection of mathematical functions to operate on these arrays. Together, Pandas and NumPy form the backbone of data manipulation in Python, facilitating tasks such as data cleaning, merging, reshaping, and statistical analysis, thus streamlining the data preparation process for machine learning and other data-driven applications.\n\nLearn more from the following resources:",
"links": [
{
"title": "Pandas",
"url": "https://pandas.pydata.org/",
"type": "article"
},
{
"title": "NumPy",
"url": "https://numpy.org/",
"type": "article"
},
{
"title": "Top Python Libraries for Data Science",
"url": "https://www.simplilearn.com/top-python-libraries-for-data-science-article",
"type": "article"
}
]
},
"8OXmF2Gn6TYJotBRvDjqA": {
"title": "Pandas",
"description": "Pandas is a widely acknowledged and highly useful data manipulation library in the world of data analysis. Known for its robust features like data cleaning, wrangling and analysis, pandas has become one of the go-to tools for data analysts. Built on NumPy, it provides high-performance, easy-to-use data structures and data analysis tools. In essence, its flexibility and versatility make it a critical part of the data analyst's toolkit, as it holds the capability to cater to virtually every data manipulation task.\n\nLearn more from the following resources:",
"links": [
{
"title": "Pandas Website",
"title": "Pandas",
"url": "https://pandas.pydata.org/",
"type": "article"
},
@ -473,7 +495,7 @@
},
"l1SnPc4EMqGdaIAhIQfrT": {
"title": "Data Visualisation Libraries",
"description": "Data visualization libraries are crucial in data science for transforming complex datasets into clear and interpretable visual representations, facilitating better understanding and communication of data insights. In Python, several libraries are widely used for this purpose. Matplotlib is a foundational library that offers comprehensive tools for creating static, animated, and interactive plots. Seaborn, built on top of Matplotlib, provides a high-level interface for drawing attractive and informative statistical graphics with minimal code. Plotly is another powerful library that allows for the creation of interactive and dynamic visualizations, which can be easily embedded in web applications. Additionally, libraries like Bokeh and Altair offer capabilities for creating interactive plots and dashboards, enhancing exploratory data analysis and the presentation of data findings. Together, these libraries enable data scientists to effectively visualize trends, patterns, and outliers in their data, making the analysis more accessible and actionable.",
"description": "Data visualization libraries are crucial in data science for transforming complex datasets into clear and interpretable visual representations, facilitating better understanding and communication of data insights. In Python, several libraries are widely used for this purpose. Matplotlib is a foundational library that offers comprehensive tools for creating static, animated, and interactive plots. Seaborn, built on top of Matplotlib, provides a high-level interface for drawing attractive and informative statistical graphics with minimal code. Plotly is another powerful library that allows for the creation of interactive and dynamic visualizations, which can be easily embedded in web applications. Additionally, libraries like Bokeh and Altair offer capabilities for creating interactive plots and dashboards, enhancing exploratory data analysis and the presentation of data findings. Together, these libraries enable data scientists to effectively visualize trends, patterns, and outliers in their data, making the analysis more accessible and actionable.\n\nLearn more from the following resources:",
"links": []
},
"uGkXxdMXUMY-3fQFS1jK8": {
@ -481,7 +503,7 @@
"description": "Matplotlib is a paramount data visualization library used extensively by data analysts for generating a wide array of plots and graphs. Through Matplotlib, data analysts can convey results clearly and effectively, driving insights from complex data sets. It offers a hierarchical environment which is very natural for a data scientist to work with. Providing an object-oriented API, it allows for extensive customization and integration into larger applications. From histograms, bar charts, scatter plots to 3D graphs, the versatility of Matplotlib assists data analysts in the better comprehension and compelling representation of data.\n\nLearn more from the following resources:",
"links": [
{
"title": "Matplotlib Website",
"title": "Matplotlib",
"url": "https://matplotlib.org/",
"type": "article"
},
@ -497,7 +519,7 @@
"description": "Dplyr is a powerful and popular toolkit for data manipulation in R. As a data analyst, this library provides integral functions to manipulate, clean, and process data efficiently. It has been designed to be easy and intuitive, ensuring a robust and consistent syntax. Dplyr ensures data reliability and fast processing, essential for analysts dealing with large datasets. With a strong focus on efficiency, dplyr functions like select, filter, arrange, mutate, summarise, and group\\_by optimise data analysis operations, making data manipulation a smoother and hassle-free procedure for data analysts.\n\nLearn more from the following resources:",
"links": [
{
"title": "dplyr website",
"title": "dplyr",
"url": "https://dplyr.tidyverse.org/",
"type": "article"
},
@ -513,7 +535,7 @@
"description": "When it comes to data visualization in R programming, ggplot2 stands tall as one of the primary tools for data analysts. This data visualization library, which forms part of the tidyverse suite of packages, facilitates the creation of complex and sophisticated visual narratives. With its grammar of graphics philosophy, ggplot2 enables analysts to build graphs and charts layer by layer, thereby offering detailed control over graphical features and design. Its versatility in creating tailored and aesthetically pleasing graphics is a vital asset for any data analyst tackling exploratory data analysis, reporting, or dashboard building.\n\nLearn more from the following resources:",
"links": [
{
"title": "ggplot2 website",
"title": "ggplot2",
"url": "https://ggplot2.tidyverse.org/",
"type": "article"
},
@ -526,21 +548,27 @@
},
"_sjXCLHHTbZromJYn6fnu": {
"title": "Data Collection",
"description": "In the context of the Data Analyst role, data collection is a foundational process that entails gathering relevant data from various sources. This data can be quantitative or qualitative and may be sourced from databases, online platforms, customer feedback, among others. The gathered information is then cleaned, processed, and interpreted to extract meaningful insights. A data analyst performs this whole process carefully, as the quality of data is paramount to ensuring accurate analysis, which in turn informs business decisions and strategies. This highlights the importance of an excellent understanding, proper tools, and precise techniques when it comes to data collection in data analysis.",
"links": []
"description": "Data collection is a foundational process that entails gathering relevant data from various sources. This data can be quantitative or qualitative and may be sourced from databases, online platforms, customer feedback, among others. The gathered information is then cleaned, processed, and interpreted to extract meaningful insights. A data analyst performs this whole process carefully, as the quality of data is paramount to ensuring accurate analysis, which in turn informs business decisions and strategies. This highlights the importance of an excellent understanding, proper tools, and precise techniques when it comes to data collection in data analysis.\n\nLearn more from the following resources:",
"links": [
{
"title": "Data Collection",
"url": "https://en.wikipedia.org/wiki/Data_collection",
"type": "article"
}
]
},
"tYPeLCxbqvMFlTkCGjdHg": {
"title": "Databases",
"description": "Behind every strong data analyst, there's not just a rich assortment of data, but a set of robust databases that enable effective data collection. Databases are a fundamental aspect of data collection in a world where the capability to manage, organize, and evaluate large volumes of data is critical. As a data analyst, the understanding and use of databases is instrumental in capturing the necessary data for conducting qualitative and quantitative analysis, forecasting trends and making data-driven decisions. Thorough knowledge of databases, therefore, can be considered a key component of a data analyst's arsenal. These databases can vary from relational databases such as PostgreSQL to NoSQL databases like MongoDB, each serving a unique role in the data collection process.\n\nLearn more from the following resources:",
"links": [
{
"title": "Visit Dedicated SQL Roadmap",
"url": "https://roadmap.sh/sql",
"type": "article"
},
{
"title": "Visit Dedicated PostgreSQL Roadmap",
"url": "https://roadmap.sh/postgresql-dba",
"type": "article"
}
]
"type": "article"
},
{
"title": "A Beginner's Guide to APIs",
"url": "https://www.postman.com/what-is-an-api/",
"type": "article"
}
"description": "Web scraping plays a significant role in collecting unique datasets for data analysis. In the realm of a data analyst's tasks, web scraping refers to the method of extracting information from websites and converting it into a structured usable format like a CSV, Excel spreadsheet, or even into databases. This technique allows data analysts to gather large sets of data from the internet, which otherwise could be time-consuming if done manually. The capability of web scraping and parsing data effectively can give data analysts a competitive edge in their data analysis process, from unlocking in-depth, insightful information to making data-driven decisions.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is Web Scraping & What is it used for?",
"url": "https://www.parsehub.com/blog/what-is-web-scraping/",
"type": "article"
},
{
"title": "What is Web Scraping?",
"url": "https://www.youtube.com/watch?v=dlj_QL-ENJM",
"type": "video"
}
},
"E6cpb6kvluJM8OGuDcFBT": {
"title": "Data Cleanup",
"description": "Data cleaning, often referred to as data cleansing or data scrubbing, is one of the most important initial steps in the data analysis process. As a data analyst, the bulk of your work often revolves around understanding, cleaning, and standardizing raw data before analysis. Data cleaning involves identifying, correcting, or removing any errors or inconsistencies in datasets in order to improve their quality. The process is crucial because it directly determines the accuracy of the insights you generate - garbage in, garbage out. Even the most sophisticated models and visualizations would not be of much use if they're based on dirty data. Therefore, mastering data cleaning techniques is essential for any data analyst.\n\nLearn more from the following resources:",
"links": [
{
"title": "Data Cleaning",
"url": "https://www.tableau.com/learn/articles/what-is-data-cleaning#:~:text=tools%20and%20software-,What%20is%20data%20cleaning%3F,to%20be%20duplicated%20or%20mislabeled.",
"type": "article"
}
]
},
"X9WmfHOks82BIAzs6abqO": {
"title": "Handling Missing Data",
},
"t_BRtEharsrOZxoyX0OzV": {
"title": "Data Transformation",
"description": "Data Transformation, also known as Data Wrangling, is an essential part of a Data Analyst's role. This process involves the conversion of data from a raw format into another format to make it more appropriate and valuable for a variety of downstream purposes such as analytics. Data Analysts transform data to make it more suitable for analysis, ensure accuracy, and improve data quality. The right transformation techniques give the data structure, multiply its value, and enhance the accuracy of the resulting analytics.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is data transformation?",
"description": "In the realms of data analysis, data cleaning is a crucial preliminary process, and this is where `pandas` - a popular Python library - shines. Primarily used for data manipulation and analysis, pandas adopts flexible and powerful data structures (DataFrames and Series) that greatly simplify the process of cleaning raw, messy datasets. Data analysts often work with large volumes of data, some of which may contain missing or inconsistent data that can negatively impact the results of their analysis. By utilizing pandas, data analysts can quickly identify, manage, and fill these missing values, drop unnecessary columns, rename column headings, filter specific data, apply functions for more complex data transformations and much more. Thus, pandas is an invaluable tool for effective data cleaning in data analysis.\n\nLearn more from the following resources:",
"links": [
{
"title": "Pandas",
"url": "https://pandas.pydata.org/",
"type": "article"
},
"description": "Data cleaning plays a crucial role in the data analysis pipeline, where it rectifies and enhances the quality of data to increase the efficiency and authenticity of the analytical process. The `dplyr` package, an integral part of the `tidyverse` suite in R, has become a staple in the toolkit of data analysts dealing with data cleaning. `dplyr` offers a coherent set of verbs that significantly simplifies the process of manipulating data structures, such as dataframes and databases. This involves selecting, sorting, filtering, creating or modifying variables, and aggregating records, among other operations. Incorporating `dplyr` into the data cleaning phase enables data analysts to perform operations more effectively, improve code readability, and handle large and complex data with ease.\n\nLearn more from the following resources:",
"links": [
{
"title": "dplyr",
"url": "https://dplyr.tidyverse.org/",
"type": "article"
},
"description": "When focusing on data analysis, understanding key statistical concepts is crucial. Amongst these, central tendency is a foundational element. Central Tendency refers to the measure that determines the center of a distribution. The average is a commonly used statistical tool by which data analysts discern trends and patterns. As one of the most recognized forms of central tendency, figuring out the \"average\" involves summing all values in a data set and dividing by the number of values. This provides analysts with a 'typical' value, around which the remaining data tends to cluster, facilitating better decision-making based on existing data.\n\nLearn more from the following resources:",
"links": [
{
"title": "How to Calculate the Average",
"url": "https://support.microsoft.com/en-gb/office/calculate-the-average-of-a-group-of-numbers-e158ef61-421c-4839-8290-34d7b1e68283#:~:text=Average%20This%20is%20the%20arithmetic,by%206%2C%20which%20is%205.",
"type": "article"
},
@ -806,7 +840,7 @@
"description": "The concept of Range refers to the spread of a dataset, primarily in the realm of statistics and data analysis. This measure is crucial for a data analyst as it provides an understanding of the variability amongst the numbers within a dataset. Specifically in a role such as Data Analyst, understanding the range and dispersion aids in making more precise analyses and predictions. Understanding the dispersion within a range can highlight anomalies, identify standard norms, and form the foundation for statistical conclusions like the standard deviation, variance, and interquartile range. It allows for the comprehension of the reliability and stability of particular datasets, which can help guide strategic decisions in many industries. Therefore, range is a key concept that every data analyst must master.\n\nLearn more from the following resources:",
"links": [
{
"title": "How to Find the Range of a Data Set",
"url": "https://www.scribbr.co.uk/stats/range-statistics/",
"type": "article"
}
"description": "Data analysts heavily rely on statistical concepts to analyze and interpret data, and one such fundamental concept is variance. Variance, an essential measure of dispersion, quantifies the spread of data, providing insight into the level of variability within the dataset. Understanding variance is crucial for data analysts as the reliability of many statistical models depends on the assumption of constant variance across observations. In other words, it helps analysts determine how much data points diverge from the expected value or mean, which can be pivotal in identifying outliers, understanding data distribution, and driving decision-making processes. However, variance can't be interpreted in the original units of measurement due to its squared nature, which is why it is often used in conjunction with its square root, the standard deviation.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is Variance?",
"url": "https://www.investopedia.com/terms/v/variance.asp",
"type": "article"
},
{
"title": "How to Calculate Variance",
"url": "https://www.scribbr.co.uk/stats/variance-meaning/",
"type": "article"
}
"description": "Tableau is a powerful data visualization tool utilized extensively by data analysts worldwide. Its primary role is to transform raw, unprocessed data into an understandable format without any technical skills or coding. Data analysts use Tableau to create data visualizations, reports, and dashboards that help businesses make more informed, data-driven decisions. They also use it to perform tasks like trend analysis, pattern identification, and forecasts, all within a user-friendly interface. Moreover, Tableau's data visualization capabilities make it easier for stakeholders to understand complex data and act on insights quickly.\n\nLearn more from the following resources:",
"links": [
{
"title": "Tableau",
"url": "https://www.tableau.com/en-gb",
"type": "article"
},
"description": "Power BI, an interactive data visualization and business analytics tool developed by Microsoft, plays a crucial role in a data analyst's work. It helps data analysts convert raw data into meaningful insights through its easy-to-use dashboards and reports. This tool provides a unified view of business data, allowing analysts to track and visualize key performance metrics and make better-informed business decisions. With Power BI, data analysts also have the ability to manipulate and produce visualizations of large data sets that can be shared across an organization, making complex statistical information more digestible.\n\nLearn more from the following resources:",
"links": [
{
"title": "Power BI",
"url": "https://www.microsoft.com/en-us/power-platform/products/power-bi",
"type": "article"
},
"description": "For a Data Analyst, understanding data and being able to represent it in a visually insightful form is a crucial part of effective decision-making in any organization. Matplotlib, a plotting library for the Python programming language, is an extremely useful tool for this purpose. It presents a versatile framework for generating line plots, scatter plots, histograms, bar charts and much more in a very straightforward manner. This library also allows for comprehensive customizations, offering a high level of control over the look and feel of the graphics it produces, which ultimately enhances the quality of data interpretation and communication.\n\nLearn more from the following resources:",
"links": [
{
"title": "Matplotlib",
"url": "https://matplotlib.org/",
"type": "article"
},
"description": "Seaborn is a robust, comprehensive Python library focused on the creation of informative and attractive statistical graphics. For a data analyst, Seaborn plays an essential role in elaborating complex visual stories with the data. It aids in understanding the data by providing an interface for drawing attractive and informative statistical graphics. Seaborn is built on top of Python's core visualization library, Matplotlib, and is integrated with data structures from Pandas. This makes Seaborn an integral tool for data visualization in the data analyst's toolkit, making the exploration and understanding of data easier and more intuitive.\n\nLearn more from the following resources:",
"links": [
{
"title": "Seaborn",
"url": "https://seaborn.pydata.org/",
"type": "article"
},
"description": "As a vital tool in the data analyst's arsenal, bar charts are essential for analyzing and interpreting complex data. Bar charts, otherwise known as bar graphs, are frequently used graphical displays for dealing with categorical data groups or discrete variables. With their stark visual contrast and definitive measurements, they provide a simple yet effective means of identifying trends, understanding data distribution, and making data-driven decisions. By analyzing the lengths or heights of different bars, data analysts can effectively compare categories or variables against each other and derive meaningful insights. Simplicity, readability, and easy interpretation are key features that make bar charts a favorite in the world of data analytics.\n\nLearn more from the following resources:",
"links": [
{
"title": "A Complete Guide to Bar Charts",
"url": "https://www.atlassian.com/data/charts/bar-chart-complete-guide",
"type": "article"
},
{
"title": "What is a Bar Chart?",
"url": "https://www.youtube.com/watch?v=WTVdncVCvKo",
"type": "video"
}
"description": "A scatter plot, a crucial aspect of data visualization, is a mathematical diagram using Cartesian coordinates to represent values from two different variables. As a data analyst, understanding and interpreting scatter plots can be instrumental in identifying correlations and trends within a dataset, drawing meaningful insights, and showcasing these findings in a clear, visual manner. In addition, scatter plots are paramount in predictive analytics as they reveal patterns which can be used to predict future occurrences.\n\nLearn more from the following resources:",
"links": [
{
"title": "Mastering Scatter Plots",
"url": "https://www.atlassian.com/data/charts/what-is-a-scatter-plot",
"type": "article"
},
"description": "A stacked chart is an essential tool for a data analyst in the field of data visualization. This type of chart presents quantitative data in a visually appealing manner and allows users to easily compare different categories while still being able to compare the total sizes. These charts are highly effective when trying to measure part-to-whole relationships, displaying accumulated totals over time or when presenting data with multiple variables. Data analysts often use stacked charts to detect patterns, trends and anomalies which can aid in strategic decision making.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is a Stacked Chart?",
"url": "https://www.spotfire.com/glossary/what-is-a-stacked-chart",
"type": "article"
},
"description": "Heatmaps are a crucial component of data visualization that Data Analysts regularly employ in their analyses. As one of many possible graphical representations of data, heatmaps show the correlation or scale of variation between two or more variables in a dataset, making them extremely useful for pattern recognition and outlier detection. Individual values within a matrix are represented in a heatmap as colors, with differing intensities indicating the degree or strength of an occurrence. In short, a Data Analyst would use a heatmap to decode complex multivariate data and turn it into an easily understandable visual that aids in decision making.\n\nLearn more from the following resources:",
"links": [
{
"title": "A Complete Guide to Heatmaps",
"url": "https://www.hotjar.com/heatmaps/",
"type": "article"
},
{
"title": "What is a Heatmap?",
"url": "https://www.atlassian.com/data/charts/heatmap-complete-guide",
"type": "article"
}
"description": "As a data analyst, understanding and efficiently using various forms of data visualization is crucial. Among these, Pie Charts represent a significant tool. Essentially, pie charts are circular statistical graphics divided into slices to illustrate numerical proportions. Each slice of the pie corresponds to a particular category. The pie chart's beauty lies in its simplicity and visual appeal, making it an effective way to convey relative proportions or percentages at a glance. For a data analyst, it's particularly useful when you want to show a simple distribution of categorical data. Like any tool, though, it's important to use pie charts wisely—ideally, when your data set has fewer than seven categories, and the proportions between categories are distinct.\n\nLearn more from the following resources:",
"links": [
{
"title": "A Complete Guide to Pie Charts",
"url": "https://www.atlassian.com/data/charts/pie-chart-complete-guide",
"type": "article"
},
{
"title": "What is a Pie Chart?",
"url": "https://www.youtube.com/watch?v=GjJdZaQrItg",
"type": "video"
}
},
"2g19zjEASJw2ve57hxpr0": {
"title": "Data Visualisation",
"description": "Data Visualization is a fundamental part of a data analyst's responsibilities. It involves presenting data in a graphical or pictorial format that allows decision-makers to see analytics visually, helping them comprehend difficult concepts or establish new patterns. With interactive visualization, data analysts can take the data analysis process to a whole new level — drilling down into charts and graphs for more detail and interactively changing what data is presented or how it’s processed. It thereby forms a crucial link in the chain of converting raw data into actionable insights, which is one of the primary roles of a Data Analyst.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is Data Visualization?",
"url": "https://www.ibm.com/think/topics/data-visualization",
"type": "article"
}
]
},
"TeewVruErSsD4VLXcaDxp": {
"title": "Statistical Analysis",
"description": "Statistical analysis is a core component of a data analyst's toolkit. As professionals dealing with vast amounts of structured and unstructured data, data analysts often turn to statistical methods to extract insights and make informed decisions. The role of statistical analysis in data analytics involves gathering, reviewing, and interpreting data for various applications, enabling businesses to understand their performance, trends, and growth potential. Data analysts use a range of statistical techniques from modeling, machine learning, and data mining to convey vital information that supports strategic company actions.\n\nLearn more from the following resources:",
"links": [
{
"title": "Understanding Statistical Analysis",
"url": "https://www.simplilearn.com/what-is-statistical-analysis-article",
"type": "article"
},
{
"title": "Statistical Analysis",
"url": "https://www.youtube.com/watch?v=XjMBZE1DuBY",
"type": "video"
}
]
},
"Xygwu0m5TeYT6S_8FKKXh": {
"title": "Hypothesis Testing",
},
"mCUW07rx74_dUNi7OGVlj": {
"title": "Visualizing Distributions",
"description": "Visualizing distributions, from a data analyst's perspective, plays a key role in understanding the overall distribution and identifying patterns within data. It aids in summarizing, structuring, and plotting structured data graphically to provide essential insights. This includes using different chart types like bar graphs, histograms, and scatter plots for interval data, and pie or bar graphs for categorical data. Ultimately, the aim is to provide a straightforward and effective manner to comprehend the data's characteristics and underlying structure. A data analyst uses these visualization techniques to make initial conclusions, detect anomalies, and decide on further analysis paths.\n\nLearn more from the following resources:",
"links": [
{
"title": "Data Visualizations that Capture Distributions",
"description": "Unsupervised learning, as a fundamental aspect of Machine Learning, holds great implications in the realm of data analytics. It is an approach where a model learns to identify patterns and relationships within a dataset that isn't labelled or classified. It is especially useful for a Data Analyst as it can assist in recognizing unforeseen trends, providing new insights or preparing data for other machine learning tasks. This ability to infer without direct supervision allows a vast potential for latent structure discovery and new knowledge derivation from raw data.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is Unsupervised Learning?",
"url": "https://cloud.google.com/discover/what-is-unsupervised-learning",
"type": "article"
},
{
"title": "Introduction to Unsupervised Learning",
"url": "https://www.datacamp.com/blog/introduction-to-unsupervised-learning",
"type": "article"
}
"description": "Supervised machine learning forms an integral part of the toolset for a Data Analyst. With a direct focus on building predictive models from labeled datasets, it involves training an algorithm based on these known inputs and outputs, helping Data Analysts establish correlations and make reliable predictions. Fortifying a Data Analyst's role, supervised machine learning enables the accurate interpretation of complex data, enhancing decision-making processes.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is Supervised Learning?",
"url": "https://cloud.google.com/discover/what-is-supervised-learning",
"type": "article"
},
"description": "As a data analyst, it's crucial to understand various model evaluation techniques. These techniques involve different methods to measure the performance or accuracy of machine learning models, for instance using a confusion matrix, precision, recall, F1 score, ROC curves, or Root Mean Squared Error (RMSE), among others. Knowing how to apply these techniques effectively not only helps in selecting the best model for a specific problem but also guides in tuning the performance of the models for optimal results. Understanding these model evaluation techniques also allows data analysts to interpret evaluation results and determine the effectiveness and applicability of a model.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is Model Evaluation?",
"url": "https://domino.ai/data-science-dictionary/model-evaluation",
"type": "article"
},
{
"title": "Model Evaluation Metrics",
"url": "https://www.markovml.com/blog/model-evaluation-metrics",
"type": "article"
}
},
"_aUQZWUhFRvNu0MZ8CPit": {
"title": "Big Data Technologies",
"description": "In the modern digitized world, Big Data refers to extremely large datasets that are challenging to manage and analyze using traditional data processing applications. These datasets often come from numerous different sources and are not only voluminous but also diverse in nature, including structured and unstructured data. The role of a data analyst in the context of big data is crucial. Data analysts are responsible for inspecting, cleaning, transforming, and modeling big data to discover useful information, draw conclusions, and support decision-making. They leverage their analytical skills and various big data tools and technologies to extract insights that can benefit the organization and drive strategic business initiatives.\n\nLearn more from the following resources:",
"links": [
{
"title": "Big Data Analytics",
"url": "https://www.ibm.com/think/topics/big-data-analytics",
"type": "article"
}
]
},
"m1IfG2sEedUxMXrv_B8GW": {
"title": "Big Data Concepts",
@ -1360,7 +1417,7 @@
"description": "Hadoop is a critical element in the realm of data processing frameworks, offering an effective solution for storing, managing, and analyzing massive amounts of data. Unraveling meaningful insights from a large deluge of data is a challenging pursuit faced by many data analysts. Regular data processing tools fail to handle large-scale data, paving the way for advanced frameworks like Hadoop. This open-source platform by Apache Software Foundation excels at storing and processing vast data across clusters of computers. Notably, Hadoop comprises two key modules - the Hadoop Distributed File System (HDFS) for storage and MapReduce for processing. Hadoop’s ability to handle both structured and unstructured data further broadens its capacity. For any data analyst, a thorough understanding of Hadoop can unlock powerful ways to manage data effectively and construct meaningful analytics.\n\nLearn more from the following resources:",
"links": [
{
"title": "Apache Hadoop",
"url": "https://hadoop.apache.org/",
"type": "article"
},
"type": "opensource"
},
{
"title": "Apache Spark",
"url": "https://spark.apache.org/",
"type": "article"
}
},
"SiYUdtYMDImRPmV2_XPkH": {
"title": "Deep Learning (Optional)",
"description": "Deep learning, a subset of machine learning, is increasingly becoming a critical tool for data analysts. Deep learning algorithms utilize multiple layers of neural networks to understand and interpret intricate structures in large data, a skill that is integral to the daily functions of a data analyst. With the ability to learn from unstructured or unlabeled data, deep learning opens a whole new range of possibilities for data analysts in terms of data processing, prediction, and categorization. It has applications in a variety of industries from healthcare to finance to e-commerce and beyond. A deeper understanding of deep learning methodologies can augment a data analyst's capability to evaluate and interpret complex datasets and provide valuable insights for decision making.\n\nLearn more from the following resources:",
"links": [
{
"title": "Deep Learning for Data Analysis",
"url": "https://www.ibm.com/think/topics/deep-learning",
"type": "article"
}
]
},
"gGHsKcS92StK5FolzmVvm": {
"title": "Neural Networks",
"description": "Neural Networks play a pivotal role in the landscape of deep learning, offering a plethora of benefits and applications for data analysts. They are computational models that emulate the way human brain processes information, enabling machines to make intelligent decisions. As a data analyst, understanding and utilizing neural networks can greatly enhance decision-making process as it allows to quickly and effectively analyze large datasets, recognize patterns, and forecast future trends. In deep learning, these networks are used for creating advanced models that can tackle complex tasks such as image recognition, natural language processing, and speech recognition, to name but a few. Therefore, an in-depth knowledge of neural networks is a significant asset for any aspiring or professional data analyst.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is a neural network?",
"title": "What is a Neural Network?",
"url": "https://aws.amazon.com/what-is/neural-network/",
"type": "article"
},
@ -1461,12 +1524,12 @@
"description": "Recurrent Neural Networks(RNNs) are a type of Artificial Neural Networks(ANNs) which introduces us to the realm of Deep Learning, an aspect that has been significantly contributing to the evolution of Data Analysis. RNNs are specifically designed to recognize patterns in sequences of data, such as text, genomes, handwriting, or the spoken word. This inherent feature of RNNs makes them extremely useful and versatile for a data analyst.\n\nA data analyst leveraging RNNs can effectively charter the intrinsic complexity of data sequences, classify them, and make accurate predictions. With the fundamental understanding of deep learning, data analysts can unlock the full potential of RNNs in delivering insightful data analysis that goes beyond traditional statistical methods. Modern research and applications of RNNs extend to multiple domains including natural language processing, speech recognition, and even in the financial sphere for stock price prediction making this a key tool in a data analyst’s arsenal.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is a recurrent neural network (RNN)?",
"title": "What is a Recurrent Neural Network (RNN)?",
"url": "https://www.ibm.com/topics/recurrent-neural-networks",
"type": "article"
},
{
"title": "Recurrent Neural Networks cheatsheet",
"title": "Recurrent Neural Networks Cheat-sheet",
"url": "https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks",
"type": "article"
}
@ -1477,10 +1540,15 @@
"description": "TensorFlow, developed by Google Brain Team, has become a crucial tool in the realm of data analytics, particularly within the field of deep learning. It's an open-source platform for machine learning, offering a comprehensive and flexible ecosystem of tools, libraries, and community resources. As a data analyst, understanding and implementing TensorFlow for deep learning models allows us to identify complex patterns and make insightful predictions which standard analysis could miss. It's in-demand skill that enhances our ability to generate accurate insights from colossal and complicated structured or unstructured data sets.\n\nLearn more from the following resources:",
"links": [
{
"title": "Tensorflow Website",
"title": "Tensorflow",
"url": "https://www.tensorflow.org/",
"type": "article"
},
{
"title": "Tensorflow Documentation",
"url": "https://www.tensorflow.org/learn",
"type": "article"
},
{
"title": "Tensorflow in 100 seconds",
"url": "https://www.youtube.com/watch?v=i8NETqtGHms",
@ -1493,10 +1561,15 @@
"description": "PyTorch, an open-source machine learning library, has gained considerable popularity among data analysts due to its simplicity and high performance in tasks such as natural language processing and artificial intelligence. Specifically, in the domain of deep learning, PyTorch stands out due to its dynamic computational graph, allowing for a highly intuitive and flexible platform for building complex models. For data analysts, mastering PyTorch can open up a broad range of opportunities for data model development, data processing, and integration of machine learning algorithms.\n\nLearn more from the following resources:",
"links": [
{
"title": "PyTorch Website",
"title": "PyTorch",
"url": "https://pytorch.org/",
"type": "article"
},
{
"title": "PyTorch Documentation",
"url": "https://pytorch.org/docs/stable/index.html",
"type": "article"
},
{
"title": "PyTorch in 100 seconds",
"url": "https://www.youtube.com/watch?v=ORMx45xqWkA",
@ -1509,7 +1582,7 @@
"description": "Image Recognition has become a significant domain because of its diverse applications, including facial recognition, object detection, character recognition, and much more. As a Data Analyst, understanding Image Recognition under Deep Learning becomes crucial. The data analyst's role in this context involves deciphering complex patterns and extracting valuable information from image data. This area of machine learning combines knowledge of data analysis, image processing, and deep neural networks to provide accurate results, contributing significantly to the progression of fields like autonomous vehicles, medical imaging, surveillance, among others. Therefore, proficiency in this field paves the way for proficient data analysis, leading to innovative solutions and improved decision-making.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is image recognition?",
"title": "What is Image Recognition?",
"url": "https://www.techtarget.com/searchenterpriseai/definition/image-recognition",
"type": "article"
},
@ -1541,12 +1614,12 @@
"description": "As a business enterprise expands, so does its data. For data analysts, the surge in information means they need efficient and scalable data storage solutions to manage vast volumes of structured and unstructured data, collectively referred to as Big Data. Big Data storage solutions are critical in preserving the integrity of data while also providing quick and easy access to the data when needed. These solutions use software and hardware components to securely store massive amounts of information across numerous servers, allowing data analysts to perform robust data extraction, data processing and complex data analyses. There are several options, from the traditional Relational Database Management Systems (RDBMS) to the more recent NoSQL databases, Hadoop ecosystems, and Cloud storage solutions, each offering unique capabilities and benefits to cater for different big data needs.\n\nLearn more from the following resources:",
"links": [
{
"title": "SQL Roadmap",
"title": "Visit Dedicated SQL Roadmap",
"url": "https://roadmap.sh/sql",
"type": "article"
},
{
"title": "PostgreSQL Roadmap",
"title": "Visit Dedicated PostgreSQL Roadmap",
"url": "https://roadmap.sh/postgresql-dba",
"type": "article"
}

@ -170,6 +170,11 @@
"title": "HTML",
"description": "HTML (Hypertext Markup Language) is the standard markup language used to create web pages and web applications. It provides a structure for content on the World Wide Web, using a system of elements and attributes to define the layout and content of a document. HTML elements are represented by tags, which browsers interpret to render the visual and auditory elements of a web page. The language has evolved through several versions, with HTML5 being the current standard, introducing semantic elements, improved multimedia support, and enhanced form controls. HTML works in conjunction with CSS for styling and JavaScript for interactivity, forming the foundation of modern web development.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Responsive Web Design Certification - Co-Learn HTML & CSS with guided projects",
"url": "https://www.freecodecamp.org/learn/2022/responsive-web-design/",
"type": "course"
},
{
"title": "W3Schools: Learn HTML",
"url": "https://www.w3schools.com/html/html_intro.asp",
@ -321,6 +326,11 @@
"url": "https://www.youtube.com/watch?v=G3e-cpL7ofc",
"type": "course"
},
{
"title": "Responsive Web Design Certification - Co-Learn HTML & CSS with guided projects",
"url": "https://www.freecodecamp.org/learn/2022/responsive-web-design/",
"type": "course"
},
{
"title": "Web.dev by Google — Learn CSS",
"url": "https://web.dev/learn/css/",
@ -1607,11 +1617,6 @@
"url": "https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP",
"type": "article"
},
{
"title": "Google Devs Content Security Policy (CSP)",
"url": "https://developers.google.com/web/fundamentals/security/csp",
"type": "article"
},
{
"title": "Web.dev - Content Security Policy (CSP)",
"url": "https://web.dev/csp/",

@ -712,14 +712,8 @@
},
"J9yIXZTtwbFzH2u4dI1ep": {
"title": "CSRF Protection",
"description": "Cross-Site Request Forgery (CSRF) Protection in PHP is a method where a website can defend itself against unwanted actions performed on behalf of the users without their consent. It's a critical aspect of security as it safeguards users against potential harmful activities. Here's an example: if users are logged into a website and get tricked into clicking a deceitful link, CSRF attacks could be triggered. To protect your PHP applications from such attacks, you can generate a unique token for every session and include it as a hidden field for all form submissions. Afterwards, you need to verify this token on the server side before performing any action.\n\n <?php\n // Generate CSRF token\n if(empty($_SESSION['csrf'])) {\n $_SESSION['csrf'] = bin2hex(random_bytes(32));\n }\n \n // Verify CSRF token\n if(isset($_POST['csrf']) && $_POST['csrf'] === $_SESSION['csrf']) {\n // valid CSRF token, perform action\n }\n ?>\n \n\nVisit the following resources to learn more:",
"links": [
{
"title": "Security Guide",
"url": "https://php.net/manual/en/security.csrf.php",
"type": "article"
}
]
"description": "Cross-Site Request Forgery (CSRF) Protection in PHP is a method where a website can defend itself against unwanted actions performed on behalf of the users without their consent. It's a critical aspect of security as it safeguards users against potential harmful activities. Here's an example: if users are logged into a website and get tricked into clicking a deceitful link, CSRF attacks could be triggered. To protect your PHP applications from such attacks, you can generate a unique token for every session and include it as a hidden field for all form submissions. Afterwards, you need to verify this token on the server side before performing any action.\n\n <?php\n // Generate CSRF token\n if(empty($_SESSION['csrf'])) {\n $_SESSION['csrf'] = bin2hex(random_bytes(32));\n }\n \n // Verify CSRF token\n if(isset($_POST['csrf']) && $_POST['csrf'] === $_SESSION['csrf']) {\n // valid CSRF token, perform action\n }\n ?>\n \n\nVisit the following resources to learn more:\n\n* \\[@article@PHP Tutorial CSRF\\] ([https://www.phptutorial.net/php-tutorial/php-csrf/](https://www.phptutorial.net/php-tutorial/php-csrf/))",
"links": []
},
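The generate-then-verify token pattern described in the CSRF node above is language-agnostic. A minimal Python sketch of the same idea, using the stdlib `secrets` module and a constant-time comparison; the `session` dict here is an illustrative stand-in for real server-side session storage:

```python
import secrets
import hmac

def get_csrf_token(session):
    # Generate a token once per session, mirroring the PHP example above
    if not session.get("csrf"):
        session["csrf"] = secrets.token_hex(32)
    return session["csrf"]

def verify_csrf_token(session, submitted):
    # Constant-time comparison avoids leaking token bytes via timing
    token = session.get("csrf")
    if not token or not submitted:
        return False
    return hmac.compare_digest(token, submitted)

session = {}
token = get_csrf_token(session)
assert verify_csrf_token(session, token)
assert not verify_csrf_token(session, "forged-token")
```

The token would be embedded as a hidden form field and checked before any state-changing action, exactly as in the PHP snippet.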
"JbWFfJiCRrXDhnuIx_lqx": {
"title": "Password Hashing",

@ -583,7 +583,7 @@
"links": [
{
"title": "pg_ctlcluster",
"url": "https://www.postgresql.org/docs/current/pgctlcluster.html",
"url": "https://manpages.ubuntu.com/manpages/focal/man1/pg_ctlcluster.1.html",
"type": "article"
}
]
@ -1075,7 +1075,7 @@
]
},
"S20aJB-VuSpXYyd0-0S8c": {
"title": "Object Priviliges",
"title": "Object Privileges",
"description": "Object privileges in PostgreSQL are the permissions given to different user roles to access or modify database objects like tables, views, sequences, and functions. Ensuring proper object privileges is crucial for maintaining a secure and well-functioning database.\n\nLearn more from the following resources:",
"links": [
{
@ -1117,7 +1117,7 @@
]
},
"t18XjeHP4uRyERdqhHpl5": {
"title": "Default Priviliges",
"title": "Default Privileges",
"description": "PostgreSQL allows you to define object privileges for various types of database objects. These privileges determine if a user can access and manipulate objects like tables, views, sequences, or functions. In this section, we will focus on understanding default privileges in PostgreSQL.\n\nLearn more from the following resources:",
"links": [
{

@ -399,8 +399,19 @@
},
"gS3ofDrqDRKbecIskIyGi": {
"title": "Product Roadmap",
"description": "The product roadmap is a strategic document that provides a detailed overview of the product's direction and vision. It outlines the product's plans, both tactical and strategic - including the specific steps necessary to achieve the company's goals and vision. As a Product Manager, you are expected to guide the creation of the product roadmap, communicating the product’s evolution to the team, stakeholders, and customers. This tool serves as an essential reference point helping to align all stakeholders with the key priorities and vision of the product, and acts as a guide for decisions around product development.",
"links": []
"description": "The product roadmap is a strategic document that provides a detailed overview of the product's direction and vision. It outlines the product's plans, both tactical and strategic - including the specific steps necessary to achieve the company's goals and vision. As a Product Manager, you are expected to guide the creation of the product roadmap, communicating the product’s evolution to the team, stakeholders, and customers. This tool serves as an essential reference point helping to align all stakeholders with the key priorities and vision of the product, and acts as a guide for decisions around product development.\n\nLearn more from the following resources:",
"links": [
{
"title": "What is a Product Roadmap? - Product Plan",
"url": "https://www.productplan.com/learn/what-is-a-product-roadmap/",
"type": "article"
},
{
"title": "What is a Product Roadmap? - Vibhor Chandel",
"url": "https://www.youtube.com/watch?v=BJR70jnpHog&ab_channel=VibhorChandel",
"type": "video"
}
]
},
"eiqV86PWizZPWsyqoBU5k": {
"title": "Creating a Roadmap",

@ -179,6 +179,21 @@
"title": "Tuples Documentation",
"url": "https://docs.python.org/3/tutorial/datastructures.html#tuples-and-sequences",
"type": "article"
},
{
"title": "When and How to Use Tuples",
"url": "https://thenewstack.io/python-for-beginners-when-and-how-to-use-tuples/",
"type": "article"
},
{
"title": "Python's tuple Data Type: A Deep Dive With Examples",
"url": "https://realpython.com/python-tuple/#getting-started-with-pythons-tuple-data-type",
"type": "article"
},
{
"title": "why are Tuples even a thing?",
"url": "https://www.youtube.com/watch?v=fR_D_KIAYrE",
"type": "video"
}
]
},
@ -205,7 +220,7 @@
},
"bc9CL_HMT-R6nXO1eR-gP": {
"title": "Dictionaries",
"description": "In Python, a dictionary is a built-in data type that allows you to store key-value pairs. Each key in the dictionary is unique, and each key is associated with a value. Dictionaries are unordered collections, meaning the order of items is not guaranteed.\n\nLearn more from the following resources:",
"description": "In Python, a dictionary is a built-in data type that allows you to store key-value pairs. Each key in the dictionary is unique, and each key is associated with a value. Starting from Python 3.7, dictionaries maintain the order of items as they were added.\n\nLearn more from the following resources:",
"links": [
{
"title": "Dictionaries in Python",
@ -216,6 +231,11 @@
"title": "W3 Schools - Dictionaries",
"url": "https://www.w3schools.com/python/python_dictionaries.asp",
"type": "article"
},
{
"title": "Dictionaries in Python",
"url": "https://realpython.com/python-dicts/",
"type": "article"
}
]
},
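The insertion-order guarantee mentioned in the Dictionaries description (an implementation detail in CPython 3.6, part of the language from 3.7) is easy to check with a short sketch:

```python
# Dictionaries preserve insertion order in Python 3.7+
inventory = {}
inventory["apples"] = 3
inventory["pears"] = 1
inventory["cherries"] = 20

assert list(inventory) == ["apples", "pears", "cherries"]

# Updating an existing key keeps its original position...
inventory["apples"] = 5
assert list(inventory) == ["apples", "pears", "cherries"]

# ...but deleting and re-inserting a key moves it to the end
del inventory["pears"]
inventory["pears"] = 2
assert list(inventory) == ["apples", "cherries", "pears"]
```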
@ -596,6 +616,11 @@
"title": "Python Iterators",
"url": "https://www.programiz.com/python-programming/iterator",
"type": "article"
},
{
"title": "Iterators and Iterables in Python",
"url": "https://realpython.com/python-iterators-iterables/",
"type": "article"
}
]
},

@ -496,7 +496,7 @@
},
"_U0VoTkqM1d6NR13p5azS": {
"title": "Patterns & Design Principles",
"description": "",
"description": "In the realm of software architecture, patterns and design principles are foundational tools that enable architects to create robust, scalable, and maintainable systems. They offer proven solutions to common problems and guide decision-making throughout the software development lifecycle. Understanding these concepts is essential for anyone following a software architect roadmap, as they bridge the gap between high-level architecture and practical implementation.",
"links": []
},
"AMDLJ_Bup-AY1chl_taV3": {

@ -557,11 +557,11 @@
},
"fY8zgbB13wxZ1CFtMSdZZ": {
"title": "SQL Tuning",
"description": "SQL tuning is a broad topic and many books have been written as reference. It's important to benchmark and profile to simulate and uncover bottlenecks.\n\n* Benchmark - Simulate high-load situations with tools such as ab.\n* Profile - Enable tools such as the slow query log to help track performance issues.\n\nBenchmarking and profiling might point you to the following optimizations.\n\nTo learn more, visit the following links:",
"description": "SQL tuning is the attempt to diagnose and repair SQL statements that fail to meet a performance standard. It is a broad topic and many books have been written as reference. It's important to benchmark and profile to simulate and uncover bottlenecks.\n\n* Benchmark - Simulate high-load situations with tools such as ab.\n* Profile - Enable tools such as the slow query log to help track performance issues.\n\nBenchmarking and profiling might point you to the following optimizations.\n\nTo learn more, visit the following links:",
"links": [
{
"title": "Optimizing MySQL Queries",
"url": "https://aiddroid.com/10-tips-optimizing-mysql-queries-dont-suck/",
"title": "Introduction to SQL Tuning - Oracle",
"url": "https://docs.oracle.com/en/database/oracle/oracle-database/23/tgsql/introduction-to-sql-tuning.html#GUID-B653E5F3-F078-4BBC-9516-B892960046A2",
"type": "article"
},
{

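The benchmark-and-profile loop described in the SQL Tuning node can be sketched with Python's bundled SQLite (chosen here only because it ships with the stdlib; the exact plan strings are engine-specific): `EXPLAIN QUERY PLAN` shows a full table scan turning into an index search once a suitable index exists.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # The human-readable plan is the last column of each EXPLAIN QUERY PLAN row
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # no index covers customer_id, so SQLite scans the table

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # the planner now searches the index instead

print(before)  # e.g. "SCAN orders"
print(after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"
```

Profiling first (slow query log, `EXPLAIN`) tells you which queries deserve this treatment; adding indexes blindly costs write performance.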
@ -387,8 +387,14 @@
},
"90_M5qABC1vZ1nsXVyqFJ": {
"title": "Good Layout Rules",
"description": "In the world of UX design, a good layout is crucial to ensure your prototype is intuitive and user-friendly. By following these good layout rules, you can ensure your designs are efficient, attractive, and easy to navigate for users.\n\nConsistency\n-----------\n\nBeing consistent with your design is vital in creating an easy-to-navigate interface. Utilize the same color schemes, typography, and other design elements consistently throughout your prototype to make it visually cohesive and user-friendly.\n\nAlignment and Spacing\n---------------------\n\nEnsure all the elements on your prototype are aligned and spaced properly. This helps create a well-structured and clean look, while also making it easy for users to navigate and understand your design.\n\nVisual Hierarchy\n----------------\n\nEstablish clear visual hierarchy by using size, color, contrast, and white space effectively. This helps users identify important elements on the screen quickly and understand the flow of your design easily.\n\nGrouping of Elements\n--------------------\n\nGroup related elements together, such as navigation menus or form input fields. This helps users recognize the purpose and function of each section more quickly and intuitively.\n\nBalance and Proportion\n----------------------\n\nCreate a balanced and proportional look by distributing elements on the screen evenly. This can be achieved through the use of grids or other layout techniques that help maintain a sense of harmony and order in your design.\n\nAccessibility\n-------------\n\nEnsure your design is accessible to all users by considering factors such as text size, contrast, and color combinations. Aim to create an inclusive prototype that caters to people of different abilities and preferences.\n\nResponsiveness and Flexibility\n------------------------------\n\nMake sure your prototype can adapt to different screen sizes and devices, ensuring a seamless user experience across various platforms. 
This is particularly important when designing for web and mobile applications.\n\nIterating and Testing\n---------------------\n\nAs you develop your design, continually test and iterate on your layout based on user feedback and data. This process will help refine your design and ensure it meets the needs and expectations of your users.\n\nBy incorporating these good layout rules into your prototyping process, you'll be well on your way to creating a user-friendly and effective design that meets the goals and objectives of your project.",
"links": []
"description": "In the world of UX design, a good layout is crucial to ensure your prototype is intuitive and user-friendly. By following these good layout rules, you can ensure your designs are efficient, attractive, and easy to navigate for users.\n\nConsistency\n-----------\n\nBeing consistent with your design is vital in creating an easy-to-navigate interface. Utilize the same color schemes, typography, and other design elements consistently throughout your prototype to make it visually cohesive and user-friendly.\n\nAlignment and Spacing\n---------------------\n\nEnsure all the elements on your prototype are aligned and spaced properly. This helps create a well-structured and clean look, while also making it easy for users to navigate and understand your design.\n\nVisual Hierarchy\n----------------\n\nEstablish clear visual hierarchy by using size, color, contrast, and white space effectively. This helps users identify important elements on the screen quickly and understand the flow of your design easily.\n\nGrouping of Elements\n--------------------\n\nGroup related elements together, such as navigation menus or form input fields. This helps users recognize the purpose and function of each section more quickly and intuitively.\n\nBalance and Proportion\n----------------------\n\nCreate a balanced and proportional look by distributing elements on the screen evenly. This can be achieved through the use of grids or other layout techniques that help maintain a sense of harmony and order in your design.\n\nAccessibility\n-------------\n\nEnsure your design is accessible to all users by considering factors such as text size, contrast, and color combinations. Aim to create an inclusive prototype that caters to people of different abilities and preferences.\n\nResponsiveness and Flexibility\n------------------------------\n\nMake sure your prototype can adapt to different screen sizes and devices, ensuring a seamless user experience across various platforms. 
This is particularly important when designing for web and mobile applications.\n\nIterating and Testing\n---------------------\n\nAs you develop your design, continually test and iterate on your layout based on user feedback and data. This process will help refine your design and ensure it meets the needs and expectations of your users.\n\nBy incorporating these good layout rules into your prototyping process, you'll be well on your way to creating a user-friendly and effective design that meets the goals and objectives of your project.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "User Interface Design Guidelines: 10 Rules of Thumb",
"url": "https://www.interaction-design.org/literature/article/user-interface-design-guidelines-10-rules-of-thumb",
"type": "article"
}
]
},
"t46s6Piyd8MoJYzdDTsjr": {
"title": "Figma",

@ -234,6 +234,11 @@
"title": "Global Properties",
"description": "Global properties allows you to add properties or methods that can be accessed throughout your application. This is particularly useful for sharing functionality or data across components without the need to pass props explicitly.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Application API - globalProperties",
"url": "https://vuejs.org/api/application.html#app-config-globalproperties",
"type": "article"
},
{
"title": "Vue.js Global Properties",
"url": "https://blog.logrocket.com/vue-js-globalproperties/",
@ -544,7 +549,7 @@
"links": [
{
"title": "Modifiers",
"url": "https://v2.vuejs.org/v2/guide/components-custom-events.html",
"url": "https://vuejs.org/guide/essentials/forms.html#modifiers",
"type": "article"
}
]
@ -580,9 +585,14 @@
"title": "Inline / Method Handlers",
"description": "In Vue.js, **inline handlers** are defined directly in the template using expressions, making them suitable for simple tasks. For example, you might use an inline handler to increment a counter. **Method handlers**, on the other hand, are defined in the `methods` option and are better for more complex logic or when reusing functionality across multiple components. They improve code readability and maintainability.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Inline Handlers",
"url": "https://vuejs.org/guide/essentials/event-handling#inline-handlers",
"type": "article"
},
{
"title": "Method Handlers",
"url": "https://v1.vuejs.org/guide/events.html",
"url": "https://vuejs.org/guide/essentials/event-handling#method-handlers",
"type": "article"
}
]
@ -591,6 +601,11 @@
"title": "Event Modifiers",
"description": "In Vue.js, event modifiers are special postfixes that you can add to event handlers to control the behavior of events more easily. They help simplify common tasks such as stopping propagation, preventing default actions, and ensuring that the event is triggered only under certain conditions.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Event Modifiers",
"url": "https://vuejs.org/guide/essentials/event-handling#event-modifiers",
"type": "article"
},
{
"title": "Event Modifiers in Vue.js",
"url": "https://www.freecodecamp.org/news/how-event-handling-works-in-vue-3-guide-for-devs/",
@ -603,8 +618,8 @@
"description": "Input bindings are a way to bind user input to a component's data. This allows the component to react to user input and update its state accordingly. Input bindings are typically used with form elements such as text inputs, checkboxes, and select dropdowns.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Input Bindings",
"url": "https://vuejs.org/guide/essentials/forms",
"title": "Key Modifiers",
"url": "https://vuejs.org/guide/essentials/event-handling#key-modifiers",
"type": "article"
}
]
@ -613,6 +628,11 @@
"title": "Mouse Button Modifiers",
"description": "Mouse button modifiers are a type of modifier that can be used with event handlers to specify which mouse button or buttons should trigger the event. These modifiers allow you to customize the behavior of event handlers, such as v-on:click, to respond to specific mouse button clicks.\n\nVisit the following resources to learn more:",
"links": [
{
"title": "Mouse Button Modifiers",
"url": "https://vuejs.org/guide/essentials/event-handling#mouse-button-modifiers",
"type": "article"
},
{
"title": "Button Modifiers",
"url": "https://medium.com/evolve-you/vue-3-keyboard-and-mouse-a4866d7d0e8",

@ -86,9 +86,6 @@ export function AdvertiseForm() {
pageProgressMessage.set('Please wait');
// Placeholder function to send data
console.log('Form data:', formData);
const { response, error } = await httpPost(
`${import.meta.env.PUBLIC_API_URL}/v1-advertise`,
formData,

@ -1,6 +1,5 @@
---
import { parse } from 'node-html-parser';
import type { Attributes } from 'node-html-parser/dist/nodes/html';
export interface Props {
icon: string;
@ -15,7 +14,6 @@ async function getSVG(name: string) {
eager: true,
});
if (!(filepath in files)) {
throw new Error(`${filepath} not found`);
}

@ -0,0 +1,143 @@
import { useEffect, useState } from 'react';
import { Modal } from '../Modal';
import { GitHubButton } from './GitHubButton';
import { GoogleButton } from './GoogleButton';
import { LinkedInButton } from './LinkedInButton';
import { EmailLoginForm } from './EmailLoginForm';
import { EmailSignupForm } from './EmailSignupForm';
type CourseLoginPopupProps = {
onClose: () => void;
checkoutAfterLogin?: boolean;
};
export const CHECKOUT_AFTER_LOGIN_KEY = 'checkoutAfterLogin';
export function CourseLoginPopup(props: CourseLoginPopupProps) {
const { onClose: parentOnClose, checkoutAfterLogin = true } = props;
const [isDisabled, setIsDisabled] = useState(false);
const [isUsingEmail, setIsUsingEmail] = useState(false);
const [emailNature, setEmailNature] = useState<'login' | 'signup' | null>(
null,
);
function onClose() {
// if user didn't login and closed the popup, we remove the checkoutAfterLogin flag
// so that login from other buttons on course page will trigger purchase
localStorage.removeItem(CHECKOUT_AFTER_LOGIN_KEY);
parentOnClose();
}
useEffect(() => {
localStorage.setItem(
CHECKOUT_AFTER_LOGIN_KEY,
checkoutAfterLogin ? '1' : '0',
);
}, [checkoutAfterLogin]);
if (emailNature) {
const emailHeader = (
<div className="mb-7 text-center">
<p className="mb-3.5 pt-2 text-2xl font-semibold leading-5 text-slate-900">
{emailNature === 'login'
? 'Login to your account'
: 'Create an account'}
</p>
<p className="mt-2 text-sm leading-4 text-slate-600">
Fill in the details below to continue
</p>
</div>
);
return (
<Modal onClose={onClose} bodyClassName="p-5 h-auto">
{emailHeader}
{emailNature === 'login' && (
<EmailLoginForm
isDisabled={isDisabled}
setIsDisabled={setIsDisabled}
/>
)}
{emailNature === 'signup' && (
<EmailSignupForm
isDisabled={isDisabled}
setIsDisabled={setIsDisabled}
/>
)}
<button
className="mt-2 w-full rounded-md border border-gray-400 py-2 text-center text-sm text-gray-600 hover:bg-gray-100"
onClick={() => setEmailNature(null)}
>
Back to Options
</button>
</Modal>
);
}
return (
<Modal onClose={onClose} bodyClassName="p-5 h-auto">
<div className="mb-7 text-center">
<p className="mb-3.5 pt-2 text-2xl font-semibold leading-5 text-slate-900">
Create or login to your account
</p>
<p className="mt-2 text-sm leading-4 text-slate-600">
Login or sign up for an account to start learning
</p>
</div>
<div className="flex w-full flex-col gap-2">
<GitHubButton
className="rounded-md border-gray-400 hover:bg-gray-100"
isDisabled={isDisabled}
setIsDisabled={setIsDisabled}
/>
<GoogleButton
className="rounded-md border-gray-400 hover:bg-gray-100"
isDisabled={isDisabled}
setIsDisabled={setIsDisabled}
/>
<LinkedInButton
className="rounded-md border-gray-400 hover:bg-gray-100"
isDisabled={isDisabled}
setIsDisabled={setIsDisabled}
/>
</div>
<div className="flex w-full items-center gap-4 py-6 text-sm text-gray-600">
<div className="h-px w-full bg-gray-200" />
OR
<div className="h-px w-full bg-gray-200" />
</div>
<div className="flex flex-row gap-2">
{!isUsingEmail && (
<button
className="flex-grow rounded-md border border-gray-400 px-4 py-2 text-sm text-gray-600 hover:bg-gray-100"
onClick={() => setIsUsingEmail(true)}
>
Use your email address
</button>
)}
{isUsingEmail && (
<>
<button
className="flex-grow rounded-md border border-gray-400 px-4 py-2 text-sm text-gray-600 hover:bg-gray-100"
onClick={() => setEmailNature('login')}
>
Already have an account
</button>
<button
className="flex-grow rounded-md border border-gray-400 px-4 py-2 text-sm text-gray-600 hover:bg-gray-100"
onClick={() => setEmailNature('signup')}
>
Create an account
</button>
</>
)}
</div>
</Modal>
);
}

@ -1,20 +1,27 @@
import { useEffect, useState } from 'react';
import { GitHubIcon } from '../ReactIcons/GitHubIcon.tsx';
import { FIRST_LOGIN_PARAM, setAuthToken } from '../../lib/jwt';
import {
FIRST_LOGIN_PARAM,
COURSE_PURCHASE_PARAM,
setAuthToken,
} from '../../lib/jwt';
import { cn } from '../../../editor/utils/classname.ts';
import { httpGet } from '../../lib/http';
import { Spinner } from '../ReactIcons/Spinner.tsx';
import { CHECKOUT_AFTER_LOGIN_KEY } from './CourseLoginPopup.tsx';
import { triggerUtmRegistration } from '../../lib/browser.ts';
type GitHubButtonProps = {
isDisabled?: boolean;
setIsDisabled?: (isDisabled: boolean) => void;
className?: string;
};
const GITHUB_REDIRECT_AT = 'githubRedirectAt';
const GITHUB_LAST_PAGE = 'githubLastPage';
export function GitHubButton(props: GitHubButtonProps) {
const { isDisabled, setIsDisabled } = props;
const { isDisabled, setIsDisabled, className } = props;
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState('');
@@ -48,7 +55,7 @@ export function GitHubButton(props: GitHubButtonProps) {
triggerUtmRegistration();
let redirectUrl = '/';
let redirectUrl = new URL('/', window.location.origin);
const gitHubRedirectAt = localStorage.getItem(GITHUB_REDIRECT_AT);
const lastPageBeforeGithub = localStorage.getItem(GITHUB_LAST_PAGE);
@@ -60,25 +67,36 @@ export function GitHubButton(props: GitHubButtonProps) {
const timeSinceRedirect = now - socialRedirectAtTime;
if (timeSinceRedirect < 30 * 1000) {
redirectUrl = lastPageBeforeGithub;
redirectUrl = new URL(lastPageBeforeGithub, window.location.origin);
}
}
const authRedirectUrl = localStorage.getItem('authRedirect');
if (authRedirectUrl) {
localStorage.removeItem('authRedirect');
redirectUrl = authRedirectUrl;
redirectUrl = new URL(authRedirectUrl, window.location.origin);
}
localStorage.removeItem(GITHUB_REDIRECT_AT);
localStorage.removeItem(GITHUB_LAST_PAGE);
setAuthToken(response.token);
const url = new URL(redirectUrl, window.location.origin);
if (response?.isNewUser) {
url.searchParams.set(FIRST_LOGIN_PARAM, '1');
redirectUrl.searchParams.set(FIRST_LOGIN_PARAM, '1');
}
window.location.href = url.toString();
const shouldTriggerPurchase =
localStorage.getItem(CHECKOUT_AFTER_LOGIN_KEY) !== '0';
if (
redirectUrl.pathname.includes('/courses/sql') &&
shouldTriggerPurchase
) {
redirectUrl.searchParams.set(COURSE_PURCHASE_PARAM, '1');
localStorage.removeItem(CHECKOUT_AFTER_LOGIN_KEY);
}
window.location.href = redirectUrl.toString();
})
.catch((err) => {
setError('Something went wrong. Please try again later.');
@@ -124,7 +142,10 @@ export function GitHubButton(props: GitHubButtonProps) {
return (
<>
<button
className="inline-flex h-10 w-full items-center justify-center gap-2 rounded border border-slate-300 bg-white p-2 text-sm font-medium text-black outline-none focus:ring-2 focus:ring-[#333] focus:ring-offset-1 disabled:cursor-not-allowed disabled:opacity-60"
className={cn(
'inline-flex h-10 w-full items-center justify-center gap-2 rounded border border-slate-300 bg-white p-2 text-sm font-medium text-black outline-none hover:border-gray-400 hover:bg-gray-50 focus:ring-2 focus:ring-[#333] focus:ring-offset-1 disabled:cursor-not-allowed disabled:opacity-60',
className,
)}
disabled={isLoading || isDisabled}
onClick={handleClick}
>

@@ -1,24 +1,32 @@
import { useEffect, useState } from 'react';
import Cookies from 'js-cookie';
import { FIRST_LOGIN_PARAM, TOKEN_COOKIE_NAME, setAuthToken } from '../../lib/jwt';
import {
FIRST_LOGIN_PARAM,
TOKEN_COOKIE_NAME,
setAuthToken,
} from '../../lib/jwt';
import { httpGet } from '../../lib/http';
import { Spinner } from '../ReactIcons/Spinner.tsx';
import { COURSE_PURCHASE_PARAM } from '../../lib/jwt';
import { GoogleIcon } from '../ReactIcons/GoogleIcon.tsx';
import { Spinner } from '../ReactIcons/Spinner.tsx';
import { CHECKOUT_AFTER_LOGIN_KEY } from './CourseLoginPopup.tsx';
import {
getStoredUtmParams,
triggerUtmRegistration,
} from '../../lib/browser.ts';
import { cn } from '../../lib/classname.ts';
type GoogleButtonProps = {
isDisabled?: boolean;
setIsDisabled?: (isDisabled: boolean) => void;
className?: string;
};
const GOOGLE_REDIRECT_AT = 'googleRedirectAt';
const GOOGLE_LAST_PAGE = 'googleLastPage';
export function GoogleButton(props: GoogleButtonProps) {
const { isDisabled, setIsDisabled } = props;
const { isDisabled, setIsDisabled, className } = props;
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState('');
@@ -41,8 +49,6 @@ export function GoogleButton(props: GoogleButtonProps) {
}`,
)
.then(({ response, error }) => {
const utmParams = getStoredUtmParams();
if (!response?.token) {
setError(error?.message || 'Something went wrong.');
setIsLoading(false);
@@ -53,7 +59,7 @@ export function GoogleButton(props: GoogleButtonProps) {
triggerUtmRegistration();
let redirectUrl = '/';
let redirectUrl = new URL('/', window.location.origin);
const googleRedirectAt = localStorage.getItem(GOOGLE_REDIRECT_AT);
const lastPageBeforeGoogle = localStorage.getItem(GOOGLE_LAST_PAGE);
@@ -65,25 +71,36 @@ export function GoogleButton(props: GoogleButtonProps) {
const timeSinceRedirect = now - socialRedirectAtTime;
if (timeSinceRedirect < 30 * 1000) {
redirectUrl = lastPageBeforeGoogle;
redirectUrl = new URL(lastPageBeforeGoogle, window.location.origin);
}
}
const authRedirectUrl = localStorage.getItem('authRedirect');
if (authRedirectUrl) {
localStorage.removeItem('authRedirect');
redirectUrl = authRedirectUrl;
redirectUrl = new URL(authRedirectUrl, window.location.origin);
}
if (response?.isNewUser) {
redirectUrl.searchParams.set(FIRST_LOGIN_PARAM, '1');
}
const shouldTriggerPurchase =
localStorage.getItem(CHECKOUT_AFTER_LOGIN_KEY) !== '0';
if (
redirectUrl.pathname.includes('/courses/sql') &&
shouldTriggerPurchase
) {
redirectUrl.searchParams.set(COURSE_PURCHASE_PARAM, '1');
localStorage.removeItem(CHECKOUT_AFTER_LOGIN_KEY);
}
localStorage.removeItem(GOOGLE_REDIRECT_AT);
localStorage.removeItem(GOOGLE_LAST_PAGE);
setAuthToken(response.token);
const url = new URL(redirectUrl, window.location.origin);
if (response?.isNewUser) {
url.searchParams.set(FIRST_LOGIN_PARAM, '1');
}
window.location.href = url.toString();
window.location.href = redirectUrl.toString();
})
.catch((err) => {
setError('Something went wrong. Please try again later.');
@@ -135,7 +152,10 @@ export function GoogleButton(props: GoogleButtonProps) {
return (
<>
<button
className="inline-flex h-10 w-full items-center justify-center gap-2 rounded border border-slate-300 bg-white p-2 text-sm font-medium text-black outline-none focus:ring-2 focus:ring-[#333] focus:ring-offset-1 disabled:cursor-not-allowed disabled:opacity-60"
className={cn(
'inline-flex h-10 w-full items-center justify-center gap-2 rounded border border-slate-300 bg-white p-2 text-sm font-medium text-black outline-none hover:border-gray-400 hover:bg-gray-50 focus:ring-2 focus:ring-[#333] focus:ring-offset-1 disabled:cursor-not-allowed disabled:opacity-60',
className,
)}
disabled={isLoading || isDisabled}
onClick={handleClick}
>

@@ -1,21 +1,29 @@
import { useEffect, useState } from 'react';
import Cookies from 'js-cookie';
import { FIRST_LOGIN_PARAM, TOKEN_COOKIE_NAME, setAuthToken } from '../../lib/jwt';
import {
FIRST_LOGIN_PARAM,
COURSE_PURCHASE_PARAM,
TOKEN_COOKIE_NAME,
setAuthToken,
} from '../../lib/jwt';
import { cn } from '../../lib/classname.ts';
import { httpGet } from '../../lib/http';
import { Spinner } from '../ReactIcons/Spinner.tsx';
import { LinkedInIcon } from '../ReactIcons/LinkedInIcon.tsx';
import { Spinner } from '../ReactIcons/Spinner.tsx';
import { CHECKOUT_AFTER_LOGIN_KEY } from './CourseLoginPopup.tsx';
import { triggerUtmRegistration } from '../../lib/browser.ts';
type LinkedInButtonProps = {
isDisabled?: boolean;
setIsDisabled?: (isDisabled: boolean) => void;
className?: string;
};
const LINKEDIN_REDIRECT_AT = 'linkedInRedirectAt';
const LINKEDIN_LAST_PAGE = 'linkedInLastPage';
export function LinkedInButton(props: LinkedInButtonProps) {
const { isDisabled, setIsDisabled } = props;
const { isDisabled, setIsDisabled, className } = props;
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState('');
@@ -48,7 +56,7 @@ export function LinkedInButton(props: LinkedInButtonProps) {
triggerUtmRegistration();
let redirectUrl = '/';
let redirectUrl = new URL('/', window.location.origin);
const linkedInRedirectAt = localStorage.getItem(LINKEDIN_REDIRECT_AT);
const lastPageBeforeLinkedIn = localStorage.getItem(LINKEDIN_LAST_PAGE);
@@ -60,25 +68,38 @@ export function LinkedInButton(props: LinkedInButtonProps) {
const timeSinceRedirect = now - socialRedirectAtTime;
if (timeSinceRedirect < 30 * 1000) {
redirectUrl = lastPageBeforeLinkedIn;
redirectUrl = new URL(
lastPageBeforeLinkedIn,
window.location.origin,
);
}
}
const authRedirectUrl = localStorage.getItem('authRedirect');
if (authRedirectUrl) {
localStorage.removeItem('authRedirect');
redirectUrl = authRedirectUrl;
redirectUrl = new URL(authRedirectUrl, window.location.origin);
}
if (response?.isNewUser) {
redirectUrl.searchParams.set(FIRST_LOGIN_PARAM, '1');
}
const shouldTriggerPurchase =
localStorage.getItem(CHECKOUT_AFTER_LOGIN_KEY) !== '0';
if (
redirectUrl.pathname.includes('/courses/sql') &&
shouldTriggerPurchase
) {
redirectUrl.searchParams.set(COURSE_PURCHASE_PARAM, '1');
localStorage.removeItem(CHECKOUT_AFTER_LOGIN_KEY);
}
localStorage.removeItem(LINKEDIN_REDIRECT_AT);
localStorage.removeItem(LINKEDIN_LAST_PAGE);
setAuthToken(response.token);
const url = new URL(redirectUrl, window.location.origin);
if (response?.isNewUser) {
url.searchParams.set(FIRST_LOGIN_PARAM, '1');
}
window.location.href = url.toString();
window.location.href = redirectUrl.toString();
})
.catch((err) => {
setError('Something went wrong. Please try again later.');
@@ -130,14 +151,17 @@ export function LinkedInButton(props: LinkedInButtonProps) {
return (
<>
<button
className="inline-flex h-10 w-full items-center justify-center gap-2 rounded border border-slate-300 bg-white p-2 text-sm font-medium text-black outline-none focus:ring-2 focus:ring-[#333] focus:ring-offset-1 disabled:cursor-not-allowed disabled:opacity-60"
className={cn(
'inline-flex h-10 w-full items-center justify-center gap-2 rounded border border-slate-300 bg-white p-2 text-sm font-medium text-black outline-none hover:border-gray-400 focus:ring-2 focus:ring-[#333] focus:ring-offset-1 disabled:cursor-not-allowed disabled:opacity-60',
className,
)}
disabled={isLoading || isDisabled}
onClick={handleClick}
>
{isLoading ? (
<Spinner className={'h-[18px] w-[18px]'} isDualRing={false} />
) : (
<LinkedInIcon className={'h-[18px] w-[18px]'} />
<LinkedInIcon className={'h-[18px] w-[18px] text-blue-700'} />
)}
Continue with LinkedIn
</button>

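The three hunks above apply the same change to GitHubButton, GoogleButton, and LinkedInButton: the post-login destination is built as a `URL` object from the start instead of a string that is converted at the end. A standalone sketch of that shared flow, written as a pure function for clarity — the param values `'fl'`/`'cp'` and the helper name are illustrative stand-ins, not the real `FIRST_LOGIN_PARAM`/`COURSE_PURCHASE_PARAM` values, and the real components read these inputs from `localStorage`:

```typescript
// Hypothetical distillation of the redirect logic shared by the three
// social-login buttons. Constants are placeholders for values imported
// from lib/jwt in the real code.
const FIRST_LOGIN_PARAM = 'fl'; // assumed value
const COURSE_PURCHASE_PARAM = 'cp'; // assumed value

type RedirectInputs = {
  lastPage?: string | null; // e.g. localStorage 'githubLastPage'
  redirectedAt?: number | null; // e.g. localStorage 'githubRedirectAt'
  authRedirect?: string | null; // localStorage 'authRedirect'
  isNewUser?: boolean;
  checkoutAfterLogin?: string | null; // CHECKOUT_AFTER_LOGIN_KEY value
  now?: number;
  origin?: string;
};

function buildRedirectUrl(inputs: RedirectInputs): URL {
  const {
    lastPage,
    redirectedAt,
    authRedirect,
    isNewUser,
    checkoutAfterLogin,
    now = Date.now(),
    origin = 'https://roadmap.sh',
  } = inputs;

  let redirectUrl = new URL('/', origin);

  // Return to the page the user left, but only if the social redirect
  // happened within the last 30 seconds.
  if (lastPage && redirectedAt && now - redirectedAt < 30 * 1000) {
    redirectUrl = new URL(lastPage, origin);
  }

  // An explicit authRedirect overrides the last-page heuristic.
  if (authRedirect) {
    redirectUrl = new URL(authRedirect, origin);
  }

  if (isNewUser) {
    redirectUrl.searchParams.set(FIRST_LOGIN_PARAM, '1');
  }

  // Resume the SQL-course checkout unless it was explicitly opted out ('0').
  const shouldTriggerPurchase = checkoutAfterLogin !== '0';
  if (redirectUrl.pathname.includes('/courses/sql') && shouldTriggerPurchase) {
    redirectUrl.searchParams.set(COURSE_PURCHASE_PARAM, '1');
  }

  return redirectUrl;
}
```

Building the `URL` up front is what lets the purchase flag be attached conditionally on `pathname` without re-parsing the string, which was the awkward part of the old `let redirectUrl = '/'` version.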
@@ -11,6 +11,7 @@ import type {
} from '../CustomRoadmap/CreateRoadmap/CreateRoadmapModal';
import { CreateRoadmapModal } from '../CustomRoadmap/CreateRoadmap/CreateRoadmapModal';
import { useToast } from '../../hooks/use-toast';
import { ContentConfirmationModal } from './ContentConfirmationModal';
export type TeamResourceConfig = {
isCustomResource: boolean;
@@ -44,6 +45,7 @@ export function RoadmapSelector(props: RoadmapSelectorProps) {
const [isCreatingRoadmap, setIsCreatingRoadmap] = useState<boolean>(false);
const [error, setError] = useState<string>('');
const [confirmationContentId, setConfirmationContentId] = useState<string>();
async function loadAllRoadmaps() {
const { error, response } = await httpGet<PageType[]>(`/pages.json`);
@@ -101,7 +103,7 @@ export function RoadmapSelector(props: RoadmapSelectorProps) {
});
}
async function addTeamResource(roadmapId: string) {
async function addTeamResource(roadmapId: string, shouldCopyContent = false) {
if (!teamId) {
return;
}
@@ -118,6 +120,7 @@ export function RoadmapSelector(props: RoadmapSelectorProps) {
resourceType: 'roadmap',
removed: [],
renderer: renderer || 'balsamiq',
shouldCopyContent,
},
);
@@ -148,8 +151,24 @@ export function RoadmapSelector(props: RoadmapSelectorProps) {
});
}
const confirmationContentIdModal = confirmationContentId && (
<ContentConfirmationModal
onClose={() => {
setConfirmationContentId('');
}}
onClick={(shouldCopy) => {
addTeamResource(confirmationContentId, shouldCopy).finally(() => {
pageProgressMessage.set('');
setConfirmationContentId('');
});
}}
/>
);
return (
<div>
{confirmationContentIdModal}
{changingRoadmapId && (
<UpdateTeamResourceModal
onClose={() => setChangingRoadmapId('')}
@@ -170,9 +189,20 @@ export function RoadmapSelector(props: RoadmapSelectorProps) {
allRoadmaps={allRoadmaps.filter((r) => r.renderer === 'editor')}
teamId={teamId}
onRoadmapAdd={(roadmapId) => {
addTeamResource(roadmapId).finally(() => {
pageProgressMessage.set('');
});
const isEditorRoadmap = allRoadmaps.find(
(r) => r.id === roadmapId && r.renderer === 'editor',
);
if (!isEditorRoadmap) {
addTeamResource(roadmapId).finally(() => {
pageProgressMessage.set('');
});
return;
}
setShowSelectRoadmapModal(false);
setConfirmationContentId(roadmapId);
}}
onRoadmapRemove={(roadmapId) => {
onRemove(roadmapId).finally(() => {});

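The RoadmapSelector hunks above add one branch to `onRoadmapAdd`: roadmaps using the `'editor'` renderer are routed through the new ContentConfirmationModal (which decides `shouldCopyContent`), while everything else is added to the team directly. A minimal sketch of just that decision, with simplified stand-in types rather than the real `PageType`:

```typescript
// Sketch of the branching added to onRoadmapAdd in RoadmapSelector.
// The Roadmap type here is a simplified stand-in for the real PageType.
type Roadmap = { id: string; renderer: 'editor' | 'balsamiq' };

function decideAddAction(
  allRoadmaps: Roadmap[],
  roadmapId: string,
): 'add-directly' | 'confirm-content-copy' {
  // Only editor-rendered roadmaps have copyable content, so only they
  // need the confirmation step before addTeamResource is called.
  const isEditorRoadmap = allRoadmaps.some(
    (r) => r.id === roadmapId && r.renderer === 'editor',
  );
  return isEditorRoadmap ? 'confirm-content-copy' : 'add-directly';
}
```

In the component, the `'confirm-content-copy'` path sets `confirmationContentId`, and the modal's `onClick(shouldCopy)` callback finally calls `addTeamResource(roadmapId, shouldCopy)`.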
@@ -1,11 +1,9 @@
---
import { Menu } from 'lucide-react';
import { AccountStreak } from '../AccountStreak/AccountStreak';
import Icon from '../AstroIcon.astro';
import { NavigationDropdown } from '../NavigationDropdown';
import { AccountDropdown } from './AccountDropdown';
import NewIndicator from './NewIndicator.astro';
import { AccountStreak } from '../AccountStreak/AccountStreak';
import { RoadmapDropdownMenu } from '../RoadmapDropdownMenu/RoadmapDropdownMenu';
import { AccountDropdown } from './AccountDropdown';
---
<div class='bg-slate-900 py-5 text-white sm:py-8'>
@@ -48,7 +46,7 @@ import { RoadmapDropdownMenu } from '../RoadmapDropdownMenu/RoadmapDropdownMenu'
</a>
<a
href='/changelog'
class='group relative text-blue-300 hover:text-white hidden md:block ml-0.5'
class='group relative ml-0.5 hidden text-blue-300 hover:text-white md:block'
>
Changelog

@@ -0,0 +1,28 @@
import type { SVGProps } from 'react';
type RoadmapLogoIconProps = SVGProps<SVGSVGElement> & {
color?: 'white' | 'black';
};
export function RoadmapLogoIcon(props: RoadmapLogoIconProps) {
const { color = 'white', ...rest } = props;
return (
<svg
xmlns="http://www.w3.org/2000/svg"
width="30"
height="30"
viewBox="0 0 283 283"
{...rest}
>
<path
fill={color === 'black' ? '#000' : '#fff'}
d="M0 39C0 17.46 17.46 0 39 0h205c21.539 0 39 17.46 39 39v205c0 21.539-17.461 39-39 39H39c-21.54 0-39-17.461-39-39V39Z"
/>
<path
fill={color === 'black' ? '#fff' : '#000'}
d="M121.215 210.72c-1.867.56-4.854 1.12-8.96 1.68-3.92.56-8.027.84-12.32.84-4.107 0-7.84-.28-11.2-.84-3.174-.56-5.88-1.68-8.12-3.36s-4.014-3.92-5.32-6.72c-1.12-2.987-1.68-6.813-1.68-11.48v-84c0-4.293.746-7.933 2.24-10.92 1.68-3.173 4.013-5.973 7-8.4s6.626-4.573 10.92-6.44c4.48-2.053 9.24-3.827 14.28-5.32a106.176 106.176 0 0 1 15.68-3.36 95.412 95.412 0 0 1 16.24-1.4c8.96 0 16.053 1.773 21.28 5.32 5.226 3.36 7.84 8.96 7.84 16.8 0 2.613-.374 5.227-1.12 7.84-.747 2.427-1.68 4.667-2.8 6.72a133.1 133.1 0 0 0-12.04.56c-4.107.373-8.12.933-12.04 1.68s-7.654 1.587-11.2 2.52c-3.36.747-6.254 1.68-8.68 2.8v95.48zm45.172-22.4c0-7.84 2.426-14.373 7.28-19.6s11.48-7.84 19.88-7.84 15.026 2.613 19.88 7.84 7.28 11.76 7.28 19.6-2.427 14.373-7.28 19.6-11.48 7.84-19.88 7.84-15.027-2.613-19.88-7.84-7.28-11.76-7.28-19.6z"
/>
</svg>
);
}

@@ -17,7 +17,7 @@ const links = [
isHighlighted: true,
},
{
link: '/ai/explore',
link: '/ai',
label: 'AI Roadmaps',
description: 'Generate roadmaps with AI',
Icon: Sparkles,

@@ -0,0 +1,67 @@
import { useQuery } from '@tanstack/react-query';
import { useEffect, useState } from 'react';
import { isLoggedIn } from '../../lib/jwt';
import {
courseProgressOptions
} from '../../queries/course-progress';
import { queryClient } from '../../stores/query-client';
import { CourseLoginPopup } from '../AuthenticationFlow/CourseLoginPopup';
import { BuyButton, COURSE_SLUG } from './BuyButton';
export function AccountButton() {
const [isVisible, setIsVisible] = useState(false);
const [showLoginModal, setShowLoginModal] = useState(false);
const { data: courseProgress, isLoading: isLoadingCourseProgress } = useQuery(
courseProgressOptions(COURSE_SLUG),
queryClient,
);
useEffect(() => {
setIsVisible(true);
}, []);
const buttonClasses =
'rounded-full px-5 py-2 text-base font-medium text-yellow-700 hover:text-yellow-500 transition-colors';
const hasEnrolled = !!courseProgress?.enrolledAt;
const loginModal = (
<CourseLoginPopup
checkoutAfterLogin={false}
onClose={() => {
setShowLoginModal(false);
}}
/>
);
if (!isVisible || isLoadingCourseProgress) {
return <button className={`${buttonClasses} opacity-0`}>...</button>;
}
if (!isLoggedIn()) {
return (
<>
<button
onClick={() => setShowLoginModal(true)}
className={`${buttonClasses} animate-fade-in`}
>
Login
</button>
{showLoginModal && loginModal}
</>
);
}
if (!hasEnrolled) {
return <BuyButton variant="top-nav" />;
}
return (
<a
href={`${import.meta.env.PUBLIC_COURSE_APP_URL}/sql`}
className={`${buttonClasses} animate-fade-in`}
>
Start Learning
</a>
);
}

@@ -0,0 +1,213 @@
import { useMutation, useQuery } from '@tanstack/react-query';
import { ArrowRightIcon } from 'lucide-react';
import { useEffect, useState } from 'react';
import { cn } from '../../lib/classname';
import { COURSE_PURCHASE_PARAM, isLoggedIn } from '../../lib/jwt';
import { coursePriceOptions } from '../../queries/billing';
import { courseProgressOptions } from '../../queries/course-progress';
import { queryClient } from '../../stores/query-client';
import { CourseLoginPopup } from '../AuthenticationFlow/CourseLoginPopup';
import { useToast } from '../../hooks/use-toast';
import { httpPost } from '../../lib/query-http';
import { deleteUrlParam, getUrlParams } from '../../lib/browser';
export const COURSE_SLUG = 'master-sql';
type CreateCheckoutSessionBody = {
courseId: string;
success?: string;
cancel?: string;
};
type CreateCheckoutSessionResponse = {
checkoutUrl: string;
};
type BuyButtonProps = {
variant?: 'main' | 'floating' | 'top-nav';
};
export function BuyButton(props: BuyButtonProps) {
const { variant = 'main' } = props;
const [isLoginPopupOpen, setIsLoginPopupOpen] = useState(false);
const toast = useToast();
const { data: coursePricing, isLoading: isLoadingCourse } = useQuery(
coursePriceOptions({ courseSlug: COURSE_SLUG }),
queryClient,
);
const { data: courseProgress, isLoading: isLoadingCourseProgress } = useQuery(
courseProgressOptions(COURSE_SLUG),
queryClient,
);
const {
mutate: createCheckoutSession,
isPending: isCreatingCheckoutSession,
} = useMutation(
{
mutationFn: (body: CreateCheckoutSessionBody) => {
return httpPost<CreateCheckoutSessionResponse>(
'/v1-create-checkout-session',
body,
);
},
onMutate: () => {
toast.loading('Creating checkout session...');
},
onSuccess: (data) => {
window.location.href = data.checkoutUrl;
},
onError: (error) => {
console.error(error);
toast.error(error?.message || 'Failed to create checkout session');
},
},
queryClient,
);
useEffect(() => {
const urlParams = getUrlParams();
const shouldTriggerPurchase = urlParams[COURSE_PURCHASE_PARAM] === '1';
if (shouldTriggerPurchase) {
deleteUrlParam(COURSE_PURCHASE_PARAM);
initPurchase();
}
}, []);
const isLoadingPricing =
isLoadingCourse || !coursePricing || isLoadingCourseProgress;
const isAlreadyEnrolled = !!courseProgress?.enrolledAt;
function initPurchase() {
if (!isLoggedIn()) {
return;
}
createCheckoutSession({
courseId: COURSE_SLUG,
success: `/courses/sql?e=1`,
cancel: `/courses/sql`,
});
}
function onBuyClick() {
if (!isLoggedIn()) {
setIsLoginPopupOpen(true);
return;
}
const hasEnrolled = !!courseProgress?.enrolledAt;
if (hasEnrolled) {
window.location.href = `${import.meta.env.PUBLIC_COURSE_APP_URL}/sql`;
return;
}
initPurchase();
}
const courseLoginPopup = isLoginPopupOpen && (
<CourseLoginPopup onClose={() => setIsLoginPopupOpen(false)} />
);
if (variant === 'main') {
return (
<div className="relative flex w-full flex-col items-center gap-2 md:w-auto">
{courseLoginPopup}
<button
onClick={onBuyClick}
disabled={isLoadingPricing}
className={cn(
'group relative inline-flex w-full min-w-[235px] items-center justify-center overflow-hidden rounded-xl bg-gradient-to-r from-yellow-500 to-yellow-300 px-8 py-3 text-base font-semibold text-black transition-all duration-300 ease-out hover:scale-[1.02] hover:shadow-[0_0_30px_rgba(234,179,8,0.4)] focus:outline-none active:ring-0 md:w-auto md:rounded-full md:text-lg',
(isLoadingPricing || isCreatingCheckoutSession) &&
'striped-loader-yellow pointer-events-none scale-105 bg-yellow-500',
)}
>
{isLoadingPricing ? (
<span className="relative flex items-center gap-2">&nbsp;</span>
) : isAlreadyEnrolled ? (
<span className="relative flex items-center gap-2">
Start Learning
</span>
) : (
<span className="relative flex items-center gap-2">
{coursePricing?.isEligibleForDiscount && coursePricing?.flag} Buy
now for{' '}
{coursePricing?.isEligibleForDiscount ? (
<span className="flex items-center gap-2">
<span className="hidden text-base line-through opacity-75 md:inline">
${coursePricing?.fullPrice}
</span>
<span className="text-base md:text-xl">
${coursePricing?.regionalPrice}
</span>
</span>
) : (
<span>${coursePricing?.regionalPrice}</span>
)}
<ArrowRightIcon className="h-5 w-5 transition-transform duration-300 ease-out group-hover:translate-x-1" />
</span>
)}
</button>
{!isLoadingPricing &&
!isAlreadyEnrolled &&
coursePricing?.isEligibleForDiscount && (
<span className="absolute top-full translate-y-2.5 text-sm text-yellow-400">
{coursePricing.regionalDiscountPercentage}% regional discount
applied
</span>
)}
</div>
);
}
if (variant === 'top-nav') {
return (
<button
onClick={onBuyClick}
disabled={isLoadingPricing}
className={`animate-fade-in rounded-full px-5 py-2 text-base font-medium text-yellow-700 transition-colors hover:text-yellow-500`}
>
Purchase Course
</button>
);
}
return (
<div className="relative flex flex-col items-center gap-2">
{courseLoginPopup}
<button
onClick={onBuyClick}
disabled={isLoadingPricing}
className={cn(
'group relative inline-flex min-w-[220px] items-center justify-center overflow-hidden rounded-full bg-gradient-to-r from-yellow-500 to-yellow-300 px-8 py-2 font-medium text-black transition-all duration-300 ease-out hover:scale-[1.02] hover:shadow-[0_0_30px_rgba(234,179,8,0.4)] focus:outline-none',
(isLoadingPricing || isCreatingCheckoutSession) &&
'striped-loader-yellow pointer-events-none bg-yellow-500',
)}
>
{isLoadingPricing ? (
<span className="relative flex items-center gap-2">&nbsp;</span>
) : isAlreadyEnrolled ? (
<span className="relative flex items-center gap-2">
Start Learning
</span>
) : (
<span className="relative flex items-center gap-2">
{coursePricing?.flag && coursePricing.isEligibleForDiscount
? coursePricing.flag
: null}{' '}
Buy Now ${coursePricing?.regionalPrice}
<ArrowRightIcon className="h-5 w-5 transition-transform duration-300 ease-out group-hover:translate-x-1" />
</span>
)}
</button>
{!isAlreadyEnrolled && coursePricing?.isEligibleForDiscount && (
<span className="top-full text-sm text-yellow-400">
{coursePricing.regionalDiscountPercentage}% regional discount applied
</span>
)}
</div>
);
}
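BuyButton's mount `useEffect` closes the loop started by the login buttons: they append `COURSE_PURCHASE_PARAM` to the redirect URL, and the effect consumes it exactly once via `getUrlParams`/`deleteUrlParam` before calling `initPurchase`. A minimal sketch of that consume-once step as a pure function — `'cp'` is an assumed stand-in for the real param value, which lives in `lib/jwt`:

```typescript
// Sketch of the consume-once purchase trigger in BuyButton's mount effect.
// 'cp' stands in for COURSE_PURCHASE_PARAM; the real value is imported
// from lib/jwt.
function consumePurchaseParam(
  search: string,
  param = 'cp',
): { shouldPurchase: boolean; cleanedSearch: string } {
  const params = new URLSearchParams(search);
  // Only the exact value '1' triggers checkout, matching the effect's check.
  const shouldPurchase = params.get(param) === '1';
  // Mirror deleteUrlParam: strip the flag so a page reload (or the Stripe
  // redirect back) cannot re-trigger a checkout session.
  params.delete(param);
  return { shouldPurchase, cleanedSearch: params.toString() };
}
```

Deleting the param before kicking off checkout is the important design choice: the checkout session redirect leaves the page, so the flag must not survive in the URL the user might return to.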

@@ -0,0 +1,145 @@
import { ChevronDown, BookIcon, CodeIcon, FileQuestion, MessageCircleQuestionIcon, CircleDot } from 'lucide-react';
import { cn } from '../../lib/classname';
import { useEffect, useState } from 'react';
type ChapterRowProps = {
counter: number;
icon: React.ReactNode;
title: string;
description: string;
lessonCount: number;
challengeCount: number;
isExpandable?: boolean;
className?: string;
lessons?: { title: string; type: 'lesson' | 'challenge' | 'quiz' }[];
};
export function ChapterRow(props: ChapterRowProps) {
const {
counter,
icon,
title,
description,
lessonCount,
challengeCount,
isExpandable = true,
className,
lessons = [],
} = props;
const [isExpanded, setIsExpanded] = useState(true);
const regularLessons = lessons.filter((l) => l.type === 'lesson');
const challenges = lessons.filter((l) =>
['challenge', 'quiz'].includes(l.type),
);
useEffect(() => {
const isMobile = window.innerWidth < 768;
setIsExpanded(!isMobile);
}, []);
return (
<div
className={cn('group relative select-none overflow-hidden', className)}
>
<div
role="button"
onClick={() => isExpandable && setIsExpanded(!isExpanded)}
className={cn(
'relative rounded-xl border border-zinc-800 bg-zinc-800 p-6',
'bg-gradient-to-br from-zinc-900/90 via-zinc-900/70 to-zinc-900/50',
!isExpanded &&
'hover:bg-gradient-to-br hover:from-zinc-900/95 hover:via-zinc-900/80 hover:to-zinc-900/60',
!isExpanded &&
'hover:cursor-pointer hover:shadow-[0_0_30px_rgba(0,0,0,0.2)]',
isExpanded && 'rounded-b-none border-b-0',
)}
>
<div className="flex items-start gap-4">
<div className="hidden flex-shrink-0 md:block">
<div className="rounded-full bg-yellow-500/10 p-3">{icon}</div>
</div>
<div className="flex-grow">
<h3 className="text-xl font-semibold tracking-wide text-white">
<span className="inline text-gray-500 md:hidden">
{counter}.{' '}
</span>
{title}
</h3>
<p className="mt-2 text-zinc-400">{description}</p>
<div className="mt-4 flex items-center gap-4">
<div className="flex items-center gap-2 text-sm text-zinc-500">
<span>{lessonCount} Lessons</span>
</div>
<div className="flex items-center gap-2 text-sm text-zinc-500">
<span>{challengeCount} Challenges</span>
</div>
</div>
</div>
{isExpandable && (
<div className="flex-shrink-0 rounded-full bg-zinc-800/80 p-2 text-zinc-400 group-hover:bg-zinc-800 group-hover:text-yellow-500">
<ChevronDown
className={cn(
'h-4 w-4 transition-transform',
isExpanded ? 'rotate-180' : '',
)}
/>
</div>
)}
</div>
</div>
{isExpanded && (
<div className="rounded-b-xl border border-t-0 border-zinc-800 bg-gradient-to-br from-zinc-900/50 via-zinc-900/30 to-zinc-900/20">
<div className="grid grid-cols-1 divide-zinc-800 md:grid-cols-2 md:divide-x">
{regularLessons.length > 0 && (
<div className="p-6 pb-0 md:pb-6">
<h4 className="mb-4 text-sm font-medium uppercase tracking-wider text-zinc-500">
Lessons
</h4>
<div className="space-y-3">
{regularLessons.map((lesson, index) => (
<div
key={index}
className="flex items-center gap-3 text-zinc-400 hover:text-yellow-500"
>
<BookIcon className="h-4 w-4" />
<span>{lesson.title}</span>
</div>
))}
</div>
</div>
)}
{challenges.length > 0 && (
<div className="p-6">
<h4 className="mb-4 text-sm font-medium uppercase tracking-wider text-zinc-500">
Exercises
</h4>
<div className="space-y-3">
{challenges.map((challenge, index) => (
<div
key={index}
className="flex items-center gap-3 text-zinc-400 hover:text-yellow-500"
>
{challenge.type === 'challenge' ? (
<CodeIcon className="h-4 w-4" />
) : (
<CircleDot className="h-4 w-4" />
)}
<span>{challenge.title}</span>
</div>
))}
</div>
</div>
)}
</div>
</div>
)}
</div>
);
}

@@ -0,0 +1,24 @@
export function CourseAuthor() {
return (
<div className="mt-8 w-full max-w-3xl space-y-4">
<div className="flex flex-row items-center gap-5">
<img
src="https://github.com/kamranahmedse.png"
alt="Kamran Ahmed"
className="size-12 rounded-full bg-yellow-500/10 md:size-16"
/>
<a
href="https://twitter.com/kamrify"
target="_blank"
className="flex flex-col"
>
<span className="text-lg font-medium text-zinc-200 md:text-2xl">
Kamran Ahmed
</span>
<span className="text-sm text-zinc-500 md:text-lg">
Software Engineer
</span>
</a>
</div>
</div>
);
}

@@ -0,0 +1,94 @@
import { MinusIcon, PlusIcon, type LucideIcon } from 'lucide-react';
import { useEffect, useState } from 'react';
import { cn } from '../../lib/classname';
type CourseFeatureProps = {
title: string;
icon: LucideIcon;
description: string;
imgUrl?: string;
};
export function CourseFeature(props: CourseFeatureProps) {
const { title, icon: Icon, description, imgUrl } = props;
const [isExpanded, setIsExpanded] = useState(false);
const [isZoomed, setIsZoomed] = useState(false);
useEffect(() => {
function onScroll() {
if (isZoomed) {
setIsZoomed(false);
}
}
window.addEventListener('scroll', onScroll);
return () => window.removeEventListener('scroll', onScroll);
}, [isZoomed]);
return (
<>
{isZoomed && (
<div
onClick={() => {
setIsZoomed(false);
setIsExpanded(false);
}}
className="fixed inset-0 z-[999] flex cursor-zoom-out items-center justify-center bg-black bg-opacity-75"
>
<img
src={imgUrl}
alt={title}
className="max-h-[50%] max-w-[90%] rounded-xl object-contain"
/>
</div>
)}
<div
className={cn(
'fixed inset-0 z-10 bg-black/70 opacity-100 transition-opacity duration-200 ease-out',
{
'pointer-events-none opacity-0': !isExpanded,
},
)}
onClick={() => setIsExpanded(false)}
></div>
<div className="relative">
<button
onClick={() => setIsExpanded(!isExpanded)}
className={cn(
'z-20 flex w-full items-center rounded-lg border border-zinc-800 bg-zinc-900/50 px-4 py-3 text-left transition-colors duration-200 ease-out hover:bg-zinc-800/40',
{
'relative bg-zinc-800 hover:bg-zinc-800': isExpanded,
},
)}
>
<span className="flex flex-grow items-center space-x-3">
<Icon />
<span>{title}</span>
</span>
{isExpanded ? (
<MinusIcon className="h-4 w-4" />
) : (
<PlusIcon className="h-4 w-4" />
)}
</button>
{isExpanded && (
<div className="absolute left-0 top-full z-20 translate-y-2 rounded-lg border border-zinc-800 bg-zinc-800 p-4">
<p>{description}</p>
{imgUrl && (
<img
onClick={() => {
setIsZoomed(true);
setIsExpanded(false);
}}
src={imgUrl}
alt={title}
className="mt-4 h-auto pointer-events-none md:pointer-events-auto w-full cursor-zoom-in rounded-lg object-right-top"
/>
)}
</div>
)}
</div>
</>
);
}

@@ -0,0 +1,113 @@
import { ChevronDownIcon } from 'lucide-react';
import { useState } from 'react';
import { SectionHeader } from './SectionHeader';
type FAQItem = {
question: string;
answer: string;
};
function FAQRow({ question, answer }: FAQItem) {
const [isExpanded, setIsExpanded] = useState(false);
return (
<div className="rounded-lg border border-zinc-800 bg-zinc-900">
<button
onClick={() => setIsExpanded(!isExpanded)}
className="flex w-full items-center justify-between p-4 md:p-6 text-left gap-2"
>
<h3 className="text-lg md:text-xl text-balance font-normal text-white">{question}</h3>
<ChevronDownIcon
className={`h-5 w-5 text-zinc-400 transition-transform duration-200 ${
isExpanded ? 'rotate-180' : ''
}`}
/>
</button>
{isExpanded && (
<div className="border-t border-zinc-800 p-6 pt-4 text-base md:text-lg leading-relaxed">
<p>{answer}</p>
</div>
)}
</div>
);
}
export function FAQSection() {
const faqs: FAQItem[] = [
{
question: 'What is the format of the course?',
answer:
'The course is written in textual format. There are several chapters; each chapter has a set of lessons, followed by a set of practice problems and quizzes. You can learn at your own pace and revisit the content anytime.',
},
{
question: 'What prerequisites do I need for this course?',
answer:
'No prior SQL knowledge is required. The course starts from the basics and gradually progresses to advanced topics.',
},
{
question: 'Do I need to have a local database to follow the course?',
answer:
'No, we have an integrated coding playground, populated with a sample database depending on the lesson, that you can use to follow the course. You can also use your own database if you have one.',
},
{
question: 'How long do I have access to the course?',
answer:
'You get lifetime access to the course including all future updates. Once you purchase, you can learn at your own pace and revisit the content anytime.',
},
{
question: 'What kind of support is available?',
answer:
'You get access to an AI tutor within the course that can help you with queries 24/7. Additionally, you can use the community forums to discuss problems and get help from other learners.',
},
{
question: 'Will I get a certificate upon completion?',
answer:
"Yes, upon completing the course and its challenges, you'll receive a certificate of completion that you can share with employers or add to your LinkedIn profile.",
},
{
question: 'Can I use this for job interviews?',
answer:
'Absolutely! The course covers common SQL interview topics and includes practical challenges similar to what you might face in technical interviews. The hands-on experience will prepare you well for real-world scenarios.',
},
{
question: "What if I don't like the course?",
answer:
'I will refund your purchase within 7 days of purchase, no questions asked. However, I would love to hear your feedback so that I can improve the course. Send me an email at kamran@roadmap.sh',
},
{
question: 'I already know SQL, can I still take this course?',
answer:
'Yes! The course starts from the basics and gradually progresses to advanced topics. You can skip the chapters that you already know and focus on the ones that you need.',
},
{
question: 'Do you offer any team licenses?',
answer: 'Yes, please contact me at kamran@roadmap.sh',
},
{
question: 'How can I gift this course to someone?',
answer:
'Please contact me at kamran@roadmap.sh and I will be happy to help you.',
},
{
question: 'What if I have a question that is not answered here?',
answer:
'Please contact me at kamran@roadmap.sh and I will be happy to help you.',
},
];
return (
<>
<SectionHeader
title="Frequently Asked Questions"
description="Find answers to common questions about the course below."
className="mt-10 md:mt-24"
/>
<div className="mt-6 md:mt-8 w-full max-w-3xl space-y-2 md:space-y-6">
{faqs.map((faq, index) => (
<FAQRow key={index} {...faq} />
))}
</div>
</>
);
}

@@ -0,0 +1,56 @@
import { useEffect, useState } from 'react';
import { cn } from '../../lib/classname';
import { BuyButton } from './BuyButton';
export function FloatingPurchase() {
const [isHidden, setIsHidden] = useState(true);
useEffect(() => {
function onScroll() {
setIsHidden(window.scrollY < 400);
}
window.addEventListener('scroll', onScroll);
return () => window.removeEventListener('scroll', onScroll);
}, []);
return (
<div
className={cn(
'fixed bottom-0 left-0 right-0 z-[5] flex items-center justify-center transition-all duration-200 ease-out',
{
'pointer-events-none -bottom-10 opacity-0': isHidden,
},
)}
>
{/* Desktop version */}
<div className="hidden mb-5 md:flex w-full max-w-[800px] items-center justify-between rounded-2xl bg-yellow-950 p-5 shadow-lg ring-1 ring-yellow-500/40">
<div className="flex flex-col">
<h2 className="mb-1 text-xl font-medium text-white">
Go from Zero to Hero in SQL
</h2>
<p className="text-sm text-zinc-400">
Get instant access to the course and start learning today
</p>
</div>
<BuyButton variant="floating" />
</div>
{/* Mobile version */}
<div className="flex md:hidden w-full flex-col bg-yellow-950 px-4 pt-3 pb-4 shadow-lg ring-1 ring-yellow-500/40">
<div className="flex flex-col items-center text-center mb-3">
<h2 className="text-lg font-medium text-white">
Master SQL Today
</h2>
<p className="text-xs text-zinc-400">
Get instant lifetime access
</p>
</div>
<BuyButton variant="floating" />
</div>
</div>
);
}

@@ -0,0 +1,415 @@
import {
ArrowUpDownIcon,
BarChartIcon,
BrainIcon,
ClipboardIcon,
CodeIcon,
DatabaseIcon,
Eye,
FileCheckIcon,
FileQuestionIcon,
GitBranchIcon,
GitMergeIcon,
LayersIcon,
TableIcon,
WrenchIcon,
} from 'lucide-react';
import { ChapterRow } from './ChapterRow';
import { CourseFeature } from './CourseFeature';
import { SectionHeader } from './SectionHeader';
import { Spotlight } from './Spotlight';
import { FloatingPurchase } from './FloatingPurchase';
import { CourseAuthor } from './CourseAuthor';
import { FAQSection } from './FAQSection';
import { BuyButton } from './BuyButton';
import { AccountButton } from './AccountButton';
import { RoadmapLogoIcon } from '../ReactIcons/RoadmapLogo';
type ChapterData = {
icon: React.ReactNode;
title: string;
description: string;
lessonCount: number;
challengeCount: number;
lessons: { title: string; type: 'lesson' | 'challenge' | 'quiz' }[];
};
export function SQLCoursePage() {
const chapters: ChapterData[] = [
{
icon: <DatabaseIcon className="h-6 w-6 text-yellow-500" />,
title: 'Introduction',
description:
'Get comfortable with database concepts and SQL fundamentals.',
lessonCount: 4,
challengeCount: 1,
lessons: [
{ title: 'Basics of Databases', type: 'lesson' },
{ title: 'What is SQL?', type: 'lesson' },
{ title: 'Types of Queries', type: 'lesson' },
{ title: 'Next Steps', type: 'lesson' },
{ title: 'Introduction Quiz', type: 'challenge' },
],
},
{
icon: <TableIcon className="h-6 w-6 text-yellow-500" />,
title: 'SQL Basics',
description: 'Master the essential SQL query operations and syntax.',
lessonCount: 9,
challengeCount: 7,
lessons: [
{ title: 'SELECT Fundamentals', type: 'lesson' },
{ title: 'Aliases and Constants', type: 'lesson' },
{ title: 'Expressions in SELECT', type: 'lesson' },
{ title: 'Selecting DISTINCT Values', type: 'lesson' },
{ title: 'Filtering with WHERE', type: 'lesson' },
{ title: 'Sorting with ORDER BY', type: 'lesson' },
{ title: 'Limiting Results with LIMIT', type: 'lesson' },
{ title: 'Handling NULL Values', type: 'lesson' },
{ title: 'Comments', type: 'lesson' },
{ title: 'Basic Queries Quiz', type: 'quiz' },
{ title: 'Projection Challenge', type: 'challenge' },
{ title: 'Select Expression', type: 'challenge' },
{ title: 'Select Unique', type: 'challenge' },
{ title: 'Logical Operators', type: 'challenge' },
{ title: 'Sorting Challenge', type: 'challenge' },
{ title: 'Sorting and Limiting', type: 'challenge' },
{ title: 'Sorting and Filtering', type: 'challenge' },
],
},
{
icon: <CodeIcon className="h-6 w-6 text-yellow-500" />,
title: 'Manipulating Data',
description: 'Learn how to modify and manipulate data in your database.',
lessonCount: 3,
challengeCount: 3,
lessons: [
{ title: 'INSERT Operations', type: 'lesson' },
{ title: 'UPDATE Operations', type: 'lesson' },
{ title: 'DELETE Operations', type: 'lesson' },
{ title: 'Data Manipulation Quiz', type: 'quiz' },
{ title: 'Inserting Customers', type: 'challenge' },
{ title: 'Updating Bookstore', type: 'challenge' },
{ title: 'Deleting Books', type: 'challenge' },
],
},
{
icon: <LayersIcon className="h-6 w-6 text-yellow-500" />,
title: 'Defining Tables',
description: 'Master database schema design and table management.',
lessonCount: 9,
challengeCount: 7,
lessons: [
{ title: 'Creating Tables', type: 'lesson' },
{ title: 'Data Types in SQLite', type: 'lesson' },
{ title: 'Common Data Types', type: 'lesson' },
{ title: 'More on Numeric Types', type: 'lesson' },
{ title: 'Temporal Data Types', type: 'lesson' },
{ title: 'CHECK Constraints', type: 'lesson' },
{ title: 'Primary Key Constraint', type: 'lesson' },
{ title: 'Modifying Tables', type: 'lesson' },
{ title: 'Dropping and Truncating', type: 'lesson' },
{ title: 'Defining Tables Quiz', type: 'quiz' },
{ title: 'Simple Table Creation', type: 'challenge' },
{ title: 'Data Types Challenge', type: 'challenge' },
{ title: 'Constraints Challenge', type: 'challenge' },
{ title: 'Temporal Validation', type: 'challenge' },
{ title: 'Sales Data Analysis', type: 'challenge' },
{ title: 'Modifying Tables', type: 'challenge' },
{ title: 'Removing Table Data', type: 'challenge' },
],
},
{
icon: <GitMergeIcon className="h-6 w-6 text-yellow-500" />,
title: 'Multi-Table Queries',
description:
'Learn to work with multiple tables using JOINs and relationships.',
lessonCount: 7,
challengeCount: 10,
lessons: [
{ title: 'More on Relational Data', type: 'lesson' },
{ title: 'Relationships and Types', type: 'lesson' },
{ title: 'JOINs in Queries', type: 'lesson' },
{ title: 'Self Joins and Use Cases', type: 'lesson' },
{ title: 'Foreign Key Constraint', type: 'lesson' },
{ title: 'Set Operator Queries', type: 'lesson' },
{ title: 'Views and Virtual Tables', type: 'lesson' },
{ title: 'Multi-Table Queries Quiz', type: 'quiz' },
{ title: 'Inactive Customers', type: 'challenge' },
{ title: 'Recent 3 Orders', type: 'challenge' },
{ title: 'High Value Orders', type: 'challenge' },
{ title: 'Specific Book Customers', type: 'challenge' },
{ title: 'Referred Customers', type: 'challenge' },
{ title: 'Readers Like You', type: 'challenge' },
{ title: 'Same Price Books', type: 'challenge' },
{ title: 'Multi-Section Authors', type: 'challenge' },
{ title: 'Expensive Books', type: 'challenge' },
{ title: 'Trending Tech Books', type: 'challenge' },
],
},
{
icon: <WrenchIcon className="h-6 w-6 text-yellow-500" />,
title: 'Aggregate Functions',
description:
"Analyze and summarize data using SQL's powerful aggregation features.",
lessonCount: 4,
challengeCount: 10,
lessons: [
{ title: 'What is Aggregation?', type: 'lesson' },
{ title: 'Basic Aggregation', type: 'lesson' },
{ title: 'Grouping Data', type: 'lesson' },
{ title: 'Grouping and Filtering', type: 'lesson' },
{ title: 'Aggregate Queries Quiz', type: 'quiz' },
{ title: 'Book Sales Summary', type: 'challenge' },
{ title: 'Category Insights', type: 'challenge' },
{ title: 'Author Tier Analysis', type: 'challenge' },
{ title: 'Author Book Stats', type: 'challenge' },
{ title: 'Daily Sales Report', type: 'challenge' },
{ title: 'Publisher Stats', type: 'challenge' },
{ title: 'High Value Publishers', type: 'challenge' },
{ title: 'Premium Authors', type: 'challenge' },
{ title: 'Sales Analysis', type: 'challenge' },
{ title: 'Employee Performance', type: 'challenge' },
],
},
{
icon: <BarChartIcon className="h-6 w-6 text-yellow-500" />,
title: 'Scalar Functions',
description:
'Master built-in functions for data transformation and manipulation.',
lessonCount: 6,
challengeCount: 5,
lessons: [
{ title: 'What are they?', type: 'lesson' },
{ title: 'String Functions', type: 'lesson' },
{ title: 'Numeric Functions', type: 'lesson' },
{ title: 'Date Functions', type: 'lesson' },
{ title: 'Conversion Functions', type: 'lesson' },
{ title: 'Logical Functions', type: 'lesson' },
{ title: 'Scalar Functions Quiz', type: 'quiz' },
{ title: 'Customer Contact List', type: 'challenge' },
{ title: 'Membership Duration', type: 'challenge' },
{ title: 'Book Performance', type: 'challenge' },
{ title: 'Book Categories', type: 'challenge' },
{ title: 'Monthly Sales Analysis', type: 'challenge' },
],
},
{
icon: <GitBranchIcon className="h-6 w-6 text-yellow-500" />,
title: 'Subqueries and CTEs',
description:
'Write complex queries using subqueries and common table expressions.',
lessonCount: 4,
challengeCount: 6,
lessons: [
{ title: 'What are Subqueries?', type: 'lesson' },
{ title: 'Correlated Subqueries', type: 'lesson' },
{ title: 'Common Table Expressions', type: 'lesson' },
{ title: 'Recursive CTEs', type: 'lesson' },
{ title: 'Subqueries Quiz', type: 'quiz' },
{ title: 'Books Above Average', type: 'challenge' },
{ title: 'Latest Category Books', type: 'challenge' },
{ title: 'Low Stock by Category', type: 'challenge' },
{ title: 'Bestseller Rankings', type: 'challenge' },
{ title: 'New Customer Analysis', type: 'challenge' },
{ title: 'Daily Sales Report', type: 'challenge' },
],
},
{
icon: <ArrowUpDownIcon className="h-6 w-6 text-yellow-500" />,
title: 'Window Functions',
description:
'Advanced analytics and calculations using window functions.',
lessonCount: 5,
challengeCount: 7,
lessons: [
{ title: 'What are they?', type: 'lesson' },
{ title: 'OVER and PARTITION BY', type: 'lesson' },
{ title: 'Use of ORDER BY', type: 'lesson' },
{ title: 'Ranking Functions', type: 'lesson' },
{ title: 'Window Frames', type: 'lesson' },
{ title: 'Window Functions Quiz', type: 'quiz' },
{ title: 'Basic Sales Metrics', type: 'challenge' },
{ title: 'Bestseller Comparison', type: 'challenge' },
{ title: 'Author Category Sales', type: 'challenge' },
{ title: 'Top Authors', type: 'challenge' },
{ title: 'Price Tier Rankings', type: 'challenge' },
{ title: 'Month-over-Month Sales', type: 'challenge' },
{ title: 'Price Range Analysis', type: 'challenge' },
],
},
];
return (
<div className="flex flex-grow flex-col items-center bg-gradient-to-b from-zinc-900 to-zinc-950 px-4 pb-52 pt-3 text-zinc-400 md:px-10 md:pt-8">
<div className="flex w-full items-center justify-between">
<a
href="https://roadmap.sh"
target="_blank"
rel="noopener noreferrer"
className="opacity-20 transition-opacity hover:opacity-100"
>
<RoadmapLogoIcon />
</a>
<AccountButton />
</div>
<div className="relative mt-7 max-w-3xl text-left md:mt-20 md:text-center">
<Spotlight className="left-[-170px] top-[-200px]" fill="#EAB308" />
<div className="inline-block rounded-full bg-yellow-500/10 px-4 py-1.5 text-base text-yellow-500 md:px-6 md:py-2 md:text-lg">
<span className="hidden sm:block">
Complete Course to Master Practical SQL
</span>
<span className="block sm:hidden">Complete SQL Course</span>
</div>
<h1 className="mt-5 text-4xl font-bold tracking-tight text-white md:mt-8 md:text-7xl">
Master SQL <span className="hidden min-[384px]:inline">Queries</span>
<div className="mt-2.5 bg-gradient-to-r from-yellow-500 to-yellow-300 bg-clip-text text-transparent md:text-6xl lg:text-7xl">
From Basic to Advanced
</div>
</h1>
<p className="mx-auto my-5 max-w-2xl text-xl text-zinc-300 md:my-12 lg:text-2xl">
A structured course to master database querying - perfect for
developers, data analysts, and anyone working with data.
</p>
<div className="hidden flex-row items-center justify-center gap-5 md:flex">
<div className="flex flex-row items-center gap-2">
<ClipboardIcon className="size-6 text-yellow-600" />
<span>55+ Lessons</span>
</div>
<div className="flex flex-row items-center gap-2">
<FileQuestionIcon className="size-6 text-yellow-600" />
<span>100+ Challenges</span>
</div>
<div className="flex flex-row items-center gap-2">
<CodeIcon className="size-6 text-yellow-600" />
<span>Integrated IDE</span>
</div>
<div className="flex flex-row items-center gap-2">
<BrainIcon className="size-6 text-yellow-600" />
<span>AI Tutor</span>
</div>
</div>
<div className="mt-7 flex justify-start md:mt-12 md:justify-center">
<BuyButton variant="main" />
</div>
</div>
<SectionHeader
title="Not your average SQL course"
description="Built around a text-based interactive approach and packed with practical challenges, this course stands out with features that make it truly unique."
className="mt-16 md:mt-32"
/>
<div className="mx-auto mt-6 w-full max-w-5xl md:mt-10">
<div className="grid grid-cols-1 gap-2 md:grid-cols-2 md:gap-4 lg:grid-cols-3">
<CourseFeature
title="Textual Course"
icon={Eye}
imgUrl="https://assets.roadmap.sh/guest/textual-course.png"
description="Unlike video-based courses where you have to learn at the pace of the instructor, this course is text-based, allowing you to learn at your own pace."
/>
<CourseFeature
title="Coding Environment"
icon={CodeIcon}
imgUrl="https://assets.roadmap.sh/guest/coding-environment.png"
description="With the integrated IDE, you can practice your SQL queries in real-time, getting instant feedback on your results."
/>
<CourseFeature
title="Practical Challenges"
icon={FileQuestionIcon}
imgUrl="https://assets.roadmap.sh/guest/coding-challenges.png"
description="The course is packed with practical challenges and quizzes, allowing you to test your knowledge and skills."
/>
<CourseFeature
title="AI Instructor"
icon={BrainIcon}
description="Powerful AI tutor to help you with your queries, provide additional explanations, and get you unstuck."
imgUrl="https://assets.roadmap.sh/guest/ai-integration.png"
/>
<CourseFeature
title="Take Notes"
icon={ClipboardIcon}
description="The course allows you to take notes where you can write down your thoughts and ideas. You can revisit them later to review your progress."
imgUrl="https://assets.roadmap.sh/guest/course-notes.png"
/>
<CourseFeature
title="Completion Certificate"
icon={FileCheckIcon}
imgUrl="https://assets.roadmap.sh/guest/course-certificate.jpg"
description="The course provides a completion certificate, which you can share with your potential employers."
/>
</div>
</div>
<div className="mt-7 w-full max-w-3xl text-left md:mt-9">
<p className="text-lg leading-normal md:text-xl">
Oh, and you get{' '}
<span className="bg-gradient-to-r from-yellow-500 to-yellow-300 bg-clip-text text-transparent">
lifetime access
</span>{' '}
to the course, including all future updates. Also, there is a
certificate of completion which you can share with your potential
employers.
</p>
</div>
<SectionHeader
title="Course Overview"
description="The course is designed to help you go from SQL beginner to expert
through hands-on practice with real-world scenarios, mastering
everything from basic to complex queries."
className="mt-8 md:mt-24"
/>
<div className="mt-8 w-full max-w-3xl space-y-4 md:mt-12">
{chapters.map((chapter, index) => (
<ChapterRow key={index} counter={index + 1} {...chapter} />
))}
</div>
<SectionHeader
title="About the Author"
className="mt-12 md:mt-24"
description={
<div className="mt-2 md:mt-4 flex flex-col gap-4 md:gap-6 text-lg md:text-xl leading-[1.52]">
<p>
I am Kamran Ahmed, an engineering leader with over a decade of
experience in the tech industry. Throughout my career, I have built
and scaled software systems, architected complex data systems, and
worked with large amounts of data to create efficient solutions.
</p>
<p>
I am also the creator of{' '}
<a
href="https://roadmap.sh"
target="_blank"
rel="noopener noreferrer"
className="text-yellow-400"
>
roadmap.sh
</a>
, a platform trusted by millions of developers to guide their
learning journeys. I love to simplify complex topics and make
learning practical and accessible.
</p>
<p>
In this course, I will share everything I have learned about SQL
from the basics to advanced concepts in a way that is easy to
understand and apply. Whether you are just starting or looking to
sharpen your skills, you are in the right place.
</p>
</div>
}
/>
<CourseAuthor />
<FAQSection />
<FloatingPurchase />
</div>
);
}

@@ -0,0 +1,29 @@
import { cn } from '../../lib/classname';
type SectionHeaderProps = {
title: string;
description: string | React.ReactNode;
className?: string;
};
export function SectionHeader(props: SectionHeaderProps) {
const { title, description, className } = props;
return (
<div className={cn('mx-auto w-full mt-24 max-w-3xl', className)}>
<div className="relative w-full">
<div className="flex items-center gap-6">
<div className="inline-flex items-center rounded-xl">
<span className="text-2xl md:text-3xl font-medium text-zinc-200">{title}</span>
</div>
<div className="h-[1px] flex-grow bg-gradient-to-r from-yellow-500/20 to-transparent"></div>
</div>
</div>
{typeof description === 'string' ? (
<p className="mt-2 md:mt-5 text-lg md:text-xl text-zinc-400">{description}</p>
) : (
description
)}
</div>
);
}

@@ -0,0 +1,57 @@
import { cn } from '../../lib/classname';
type SpotlightProps = {
className?: string;
fill?: string;
};
export function Spotlight(props: SpotlightProps) {
const { className, fill } = props;
return (
<svg
className={cn(
'animate-spotlight pointer-events-none absolute z-[1] h-[169%] w-[238%] opacity-0 lg:w-[138%]',
className,
)}
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 3787 2842"
fill="none"
>
<g filter="url(#filter)">
<ellipse
cx="1924.71"
cy="273.501"
rx="1924.71"
ry="273.501"
transform="matrix(-0.822377 -0.568943 -0.568943 0.822377 3631.88 2291.09)"
fill={fill || 'white'}
fillOpacity="0.21"
></ellipse>
</g>
<defs>
<filter
id="filter"
x="0.860352"
y="0.838989"
width="3785.16"
height="2840.26"
filterUnits="userSpaceOnUse"
colorInterpolationFilters="sRGB"
>
<feFlood floodOpacity="0" result="BackgroundImageFix"></feFlood>
<feBlend
mode="normal"
in="SourceGraphic"
in2="BackgroundImageFix"
result="shape"
></feBlend>
<feGaussianBlur
stdDeviation="151"
result="effect1_foregroundBlur_1065_8"
></feGaussianBlur>
</filter>
</defs>
</svg>
);
}

@@ -108,7 +108,6 @@ export function TopicProgressButton(props: TopicProgressButtonProps) {
useKeydown(
'r',
() => {
console.log(progress);
if (progress === 'pending') {
onClose();
return;

@@ -13,10 +13,10 @@ export function UserProgressModalHeader(props: UserProgressModalHeaderProps) {
const userProgressTotal = progress?.total || 0;
const userDone = progress?.done?.length || 0;
const userSkipped = progress?.skipped?.length || 0;
const progressPercentage =
Math.round((userDone / userProgressTotal) * 100) || 0;
Math.round(((userDone + userSkipped) / userProgressTotal) * 100) || 0;
const userLearning = progress?.learning?.length || 0;
const userSkipped = progress?.skipped?.length || 0;
return (
<div className="p-4">

@@ -91,7 +91,7 @@ And in case of authorization failure, i.e., if the user tries to perform an oper
Given below is the list of common authentication strategies:
- Basics of Authentication
- Basic Authentication
- Session-Based Authentication
- Token-Based Authentication
- JWT Authentication

@@ -0,0 +1,235 @@
---
title: 'DevOps Engineer Job Description [@currentYear@ Template]'
description: 'Create the perfect DevOps Engineer job description with our @currentYear@ template, tailored to attract top talent in today''s tech landscape.'
authorId: william
excludedBySlug: '/devops/job-description'
seo:
title: 'DevOps Engineer Job Description [@currentYear@ Template]'
description: 'Create the perfect DevOps Engineer job description with our @currentYear@ template, tailored to attract top talent in today''s tech landscape.'
ogImageUrl: 'https://assets.roadmap.sh/guest/devops-engineer-job-description-0xjml.jpg'
relatedTitle: 'Other Guides'
relatedGuidesId: 'devops'
isNew: true
type: 'textual'
date: '2025-01-17'
sitemap:
priority: 0.7
changefreq: 'weekly'
tags:
- 'guide'
- 'textual-guide'
- 'guide-sitemap'
---
![DevOps engineer job description template](https://assets.roadmap.sh/guest/devops-engineer-job-description-0xjml.jpg)
As businesses adopt agile practices to stay competitive, the demand for skilled DevOps professionals is on the rise. Hiring top talent starts with understanding their role and crafting a clear, compelling job description.
To help you create the ideal profile, this guide draws insights from top job boards like Indeed, LinkedIn, and Glassdoor. It covers the key responsibilities, essential skills, and qualifications of a [DevOps engineer](https://roadmap.sh/devops). Use this template as your go-to resource for attracting the best candidates in the field.
## DevOps engineer job description template
A DevOps engineer ensures seamless collaboration between software development and IT operations to improve software delivery speed and system reliability. Here is a DevOps engineer job description template that highlights the essential skills and qualifications that hiring managers look for in potential candidates.
**Job Title:** DevOps Engineer
**Company:** [Add your company name]
**Location:** [Specify your location]
**Job Type:** Full-time or part-time
**About Us:** [Provide the company name and a quick summary of its achievements, history, and goals]
**Job description**
**[Company Name]** is looking for an experienced DevOps engineer with strong technical expertise in CI/CD pipelines, infrastructure automation, and cloud platforms, along with excellent collaboration and communication skills. The candidate should have hands-on experience with configuration management tools, a solid understanding of DevOps practices, and a working knowledge of internal backend systems. The ideal candidate will be able to coordinate and bridge gaps between the software development and operations teams, ensuring a smooth workflow.
**Responsibilities**
A DevOps engineer's responsibilities include:
- Designing and maintaining continuous integration/continuous deployment pipelines to automate code testing and deployment
- Tracking software performance, fixing errors, troubleshooting systems, implementing preventive measures for smooth workflows, and building automated processes
- Optimizing cloud resources and implementing cost-effective solutions
- Implementing and managing infrastructure as code (IaC) with tools like Terraform or CloudFormation rather than through manual processes
- Collaborating across teams to resolve issues quickly and deploy new features smoothly
- Monitoring and creating new processes based on performance analysis
- Managing the software development process and implementing configuration management tools
- Automating repetitive tasks to improve team efficiency
- Implementing security best practices, including automated compliance checks and secure code deployment
**Requirements**
The candidate must meet the following requirements for the DevOps engineer role:
- Hands-on experience with CI/CD tools
- Good experience with infrastructure as code tools
- Familiarity with monitoring and logging tools
- Proficiency in Docker for packaging applications and Kubernetes for managing containers
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills to ensure proper teamwork
- Proficiency in scripting languages like Python, PowerShell, etc.
- A Bachelor's degree in Computer Science, Engineering, or a related discipline, or equivalent industry experience
For an entry-level DevOps engineer role, recruiters might look for:
- Basic knowledge of DevOps tools such as Git, Jenkins, and Docker
- Familiarity with at least one programming language, such as Python or Go
- Understanding of basic networking concepts
- Willingness to embrace continuous learning and adoption of new tools
**Nice to have**
- Certification in cloud platforms, such as AWS Certified DevOps Engineer
- Good knowledge of agile methodologies and practices
**What we offer**
Highlight what your company offers, such as competitive salary, health benefits, professional development opportunities, flexible work arrangements, or other perks.
**How to apply**
If you are interested in the DevOps engineer role, send your resume and cover letter to [insert email address] or apply through [insert application portal link].
## Key DevOps engineer skills essential for the job profile
When recruiting a DevOps engineer, prioritize candidates with strong technical expertise, adaptability, and collaboration skills. DevOps requires bridging development and operations, so focus on individuals who excel in both technical problem-solving and teamwork.
![Key DevOps engineer skills](https://assets.roadmap.sh/guest/key-devops-engineer-skills-essential-for-job-profile-qhr33.png)
Let's explore the essential skills that make a strong DevOps engineer:
### Technical skills
A DevOps job description must call for a strong technical background and knowledge of critical concepts. Candidates should know how to maintain tools, perform root cause analysis, and manage projects. A DevOps engineer must possess the following technical skills:
- **Proficiency with CI/CD tools:** Familiarity with CI/CD tools, such as Jenkins, GitLab CI/CD, or Bamboo, to automate build and deployment processes
- **Coding and scripting skills:** Knowledge of programming languages like Python, Ruby, or Java, and scripting languages like Bash
- **Cloud platform expertise:** Experience with AWS or Google Cloud platform for managing scalable infrastructure
- **Containerization and orchestration:** Understanding of Docker and Kubernetes to deploy and manage containerized applications
- **Version control systems:** Expertise in Git for code repository management and facilitating team collaboration
### Cross-functional skills
Cross-functional skills are essential for DevOps engineers to enable seamless collaboration across teams and departments. These skills help bridge the gap between technical and non-technical stakeholders and drive successful project outcomes:
- **Problem-solving:** DevOps engineers must have the ability to quickly identify and resolve system bottlenecks or failures
- **Communication:** Strong communication skills to explain technical issues and collaborate effectively with multiple teams
- **Collaboration:** DevOps engineers must have a team-oriented mindset to bridge gaps between development, operations, and business stakeholders
Equipped with an understanding of the essential skills, the next step is evaluating candidates' familiarity with popular DevOps tools. Here's a quick guide to the tools recruiters should expect to see in job descriptions and interviews.
## Common DevOps tools and technologies
Candidates with experience in the following tools are often preferred, as they are essential for automating workflows and ensuring system reliability. Let's explore some of the most popular DevOps tools that are commonly included in job descriptions.
![Common DevOps tools and technologies](https://assets.roadmap.sh/guest/common-devops-tools-and-technologies-dwrxx.png)
### Jenkins
Jenkins is a popular open-source automation server used for continuous integration and continuous delivery (CI/CD). It streamlines software testing and deployment, reducing manual effort and accelerating development cycles. Jenkins offers several features, including:
- **Extensive plugin library:** Supports tools like Docker, Kubernetes, and Git for seamless integration
- **Real-time feedback:** Alerts teams to failed builds, enabling early issue resolution
- **Customizable pipelines:** Offers flexibility through domain-specific language (DSL) or GUI-based pipelines
### Docker
Docker is a containerization platform that packages applications and their dependencies into lightweight containers. These containers work seamlessly across different phases, from software development to production. Some of the key features of Docker are:
- **Isolated environment:** Maintains reliable application performance across different underlying systems
- **Faster deployment:** Enables quicker application deployment compared to traditional virtual machines
- **Support for microservices architecture:** Facilitates modular development, making it easier to build, deploy, and scale microservices-based applications
Discover additional use cases and strategies for [Docker](https://roadmap.sh/docker) in modern application development.
### Kubernetes
Kubernetes is an open-source orchestration platform for automating the deployment, scaling, and management of containerized applications. It works smoothly with Docker and other container runtimes to guarantee application reliability and scalability. Some standout features make Kubernetes a powerful choice, including:
- **Automated scaling:** Dynamically allocates resources to accommodate fluctuating workloads
- **Self-healing capabilities:** Automatically restarts failed containers or replaces unresponsive nodes to maintain application health
- **Service discovery and load balancing:** Efficiently distributes traffic across services, improving resource utilization and application performance
Learn how [Kubernetes](https://roadmap.sh/kubernetes) supports advanced DevOps workflows and container orchestration.
### Terraform
Terraform by HashiCorp is a tool for defining and provisioning infrastructure using declarative configuration files. This approach enables teams to automate the setup and management of cloud resources, maintaining consistency and reducing errors. Key features of Terraform include:
- **Multi-cloud support:** Supports multiple cloud providers, including AWS, Google Cloud, etc.
- **Version control:** Tracks infrastructure changes, enabling safe rollbacks and thorough audits
- **Reusable modules:** Simplifies infrastructure management with reusable and shareable code
Explore how [Terraform](https://roadmap.sh/terraform) empowers teams to manage modern infrastructure efficiently.
### Git
Git is a distributed version control system that allows developers to track code changes, collaborate on projects, and maintain a complete history of all modifications. Some of the key features of Git are:
- **Branching and merging:** Enables developers to work on different tasks simultaneously
- **Support for distributed workflows:** Enables offline work and seamless collaboration, providing flexibility for teams
- **Platform integration:** Integrates with platforms like GitHub, GitLab, and Bitbucket to streamline project management
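The branching-and-merging workflow described above can be sketched in a few commands. This is a minimal illustration in a throwaway repository; the repository name `demo-repo`, the branch `feature/login`, and the file `login.txt` are all made-up examples, not part of any real project:

```shell
# Create a throwaway repository to demonstrate branching and merging
git init demo-repo && cd demo-repo
git config user.email "dev@example.com" && git config user.name "Dev"
git commit --allow-empty -m "Initial commit"

# Work on an isolated feature branch without touching the default branch
git checkout -b feature/login
echo "login page" > login.txt
git add login.txt
git commit -m "Add login page"

# Switch back to the default branch and merge the finished feature
git checkout -
git merge feature/login
```

Because no other commits landed on the default branch in the meantime, the final merge is a fast-forward; in a real team setting the same flow typically goes through a pull request on GitHub, GitLab, or Bitbucket instead of a local merge.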
### Prometheus and Grafana
Prometheus and Grafana are often used together to monitor and visualize application performance.
- **Prometheus:** A powerful open-source monitoring system designed for metrics collection and alerting. Features a robust query language (PromQL) for analyzing time-series data
- **Grafana:** A visualization tool that creates interactive dashboards using data from Prometheus and other sources. Enables teams to monitor key metrics in real time
Prometheus collects and stores metrics, while Grafana visualizes these metrics in customizable dashboards. This combination empowers teams to:
- Track application performance in real time
- Diagnose and resolve system issues efficiently
- Set up alerts to ensure minimal downtime and maintain service reliability
Now that you're familiar with the essential tools, it's time to explore career growth opportunities and roles within your organization that align with a DevOps engineer's expertise.
## Growth opportunities and the importance of continuous learning
By continuously building skills and staying ahead of industry trends, DevOps engineers can advance their careers and take on more strategic roles within an organization.
For hiring managers, understanding how these roles contribute to business outcomes is key to aligning talent acquisition strategies with organizational goals. By fostering career growth, organizations not only enhance employee satisfaction but also build a workforce capable of driving innovation and operational efficiency.
From here, candidates can progress into several career paths, such as:
### DevOps architect
A [DevOps architect](https://roadmap.sh/devops/devops-engineer) is responsible for designing enterprise-level DevOps frameworks and strategies. This role involves creating scalable architectures, integrating tools and processes, and aligning DevOps practices with business objectives.
This role requires a deep understanding of cloud technologies, automation tools, and CI/CD pipelines to ensure seamless software delivery. DevOps architects also focus on enhancing collaboration among cross-functional teams and aligning technical initiatives with organizational goals. Their expertise helps businesses achieve faster deployments, improved quality, and greater operational efficiency.
### Site reliability engineer (SRE)
Site reliability engineers focus on maintaining system reliability and performance by leveraging automation and proactive monitoring. They develop robust recovery plans and address potential bottlenecks before they impact users.
SREs minimize downtime and enhance user experience, ensuring high availability of critical systems and boosting customer satisfaction. Their expertise directly reduces operational risks and strengthens business continuity.
### Cloud engineer
Cloud engineers specialize in managing cloud infrastructure and optimizing cloud-based solutions. They oversee multi-cloud or hybrid cloud environments while implementing security measures to protect resources.
Cloud engineers are well-versed in cloud providers like AWS or Google Cloud and work on automating resource provisioning, monitoring, and scaling to accommodate evolving business needs. They also play a crucial role in implementing cloud security measures and ensuring compliance with industry standards, enabling organizations to leverage the full potential of cloud technologies.
### Consultant or advisor
Consultants or advisors help organizations adopt DevOps best practices, select the right tools, and train teams to foster continuous improvement.
They play a critical role in driving organizational transformation by aligning DevOps initiatives with business objectives and empowering software engineering and operation teams with the skills and strategies needed to achieve long-term success in a competitive landscape.
IT dynamics are constantly changing, and staying relevant in the field of DevOps requires a commitment to continuous learning. Organizations that invest in the growth of their DevOps teams reap significant rewards:
- **Retention of top talent:** Employees are more likely to stay with companies that support their professional growth. Offering opportunities for skill development, DevOps training, certifications, and career advancement fosters loyalty and reduces turnover
- **Enhanced operational efficiency:** A well-trained DevOps team can implement cutting-edge tools and agile principles, improving workflow efficiency and reducing downtime. This directly translates to better product delivery and customer experience
- **Attracting skilled candidates:** Highlighting growth opportunities in job descriptions makes the organization more appealing to skilled candidates. Professionals in this field actively seek DevOps engineer roles where they can grow and contribute better
- **Fostering innovation:** Giving employees room to experiment with new technologies and methodologies drives innovation. Organizations that encourage this culture remain competitive and adaptive in a rapidly changing market
## What next?
To take the next step:
- Use the above **DevOps engineer job description template** to simplify your hiring process
- Explore our [**DevOps roadmap**](https://roadmap.sh/devops) for deeper insights into career paths and the skills that matter most
- Join the [Discord community](https://roadmap.sh/discord) to stay informed about the latest updates and meaningful discussions

![Continuous Integration and Continuous Deployment](https://assets.roadmap.sh/guest/continous-development-vs-continuous-integration-l2fak.png)
Continuous Integration (CI) and Continuous Deployment (CD) are central to DevOps principles. CI is the practice of frequently integrating code changes into a shared repository, ensuring that new code is [automatically tested](https://roadmap.sh/devops/test-automation) and validated. This practice helps catch bugs early, reducing the risk of introducing issues into the main codebase. CI allows devs and ops teams to work more efficiently, improving the overall quality of the software.
Continuous Deployment, on the other hand, takes things a step further by automatically deploying code changes to production once they pass the CI tests. This ensures that new features and bug fixes are delivered to users as quickly as possible. Together, CI and CD form a pipeline that streamlines the software development lifecycle, from code commit to production deployment in seconds (or in some cases, minutes).

The main idea is to move testing to the **left side** of the development process so that it can happen earlier and more often during the design and development phase.
Shift-Left testing aligns with the DevOps principle of continuous integration and continuous delivery (CI/CD) because [automated tests](https://roadmap.sh/devops/test-automation) can be written alongside the code and executed as part of the development pipeline. This approach ensures that issues are caught early, developers receive immediate feedback, and overall software quality is improved.
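In practice, "automated tests written alongside the code" often just means a unit test committed in the same change as the code it covers, so the pipeline can run it on every push. A minimal sketch using Python's built-in `unittest` (the function under test is invented for illustration):

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Production code under change (invented for illustration)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # Committed in the same change as apply_discount, so the pipeline
    # validates the behavior before the code is merged.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)

    def test_rejects_invalid_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

A CI stage would run this with `python -m unittest`, and a failing assertion gives the developer immediate feedback, long before the change reaches production.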
To implement Shift-Left testing, organizations often rely on a variety of automated testing tools. While the choice of tools may vary based on team preference and specific projects, below are some popular tools for performing Shift-Left testing:

![What is DevOps test automation?](https://assets.roadmap.sh/guest/devops-test-automation-nvpbi.jpg)
DevOps is a combination of cultural practices, tools, and processes that bridge the gap between development and operations teams. It aims to improve collaboration, automate workflows, and streamline the delivery of high-quality software.
Long gone are the days when putting a new feature into production meant 3 months of planning, coordination, and careful delivery. The industry has evolved into a new, faster, and more agile way of working.
The ability to deliver software quickly and reliably can make or break a business these days. Whether by responding to customer needs, addressing security vulnerabilities, or launching innovative features, the speed at which you get there is critical. This is where test automation within the DevOps framework plays a pivotal role—it helps teams accelerate delivery without sacrificing quality.
Instead of waiting for months before pushing something into production, companies are now able to do it multiple times a day, any day (even Fridays!).
Don’t believe me? Let me explain.
All of that is thanks to DevOps test automation, or in other words, the ability to integrate tests into the DevOps pipeline. That allows teams to confidently promote code to production once it has been tested and validated by a machine.
How does that all work? Let’s dive in and I’ll explain it all.
## What is DevOps Test Automation?
### Key Components of DevOps Test Automation
1. **Continuous Integration (CI):** Continuous Integration is a [DevOps](https://roadmap.sh/devops) practice where developers frequently merge their code changes into a shared repository. With each integration, an automated build and test process is triggered. The goals of CI include:
* Catching issues early by testing incremental code changes.
* Ensuring that new changes don’t break the existing codebase.
* Reducing integration problems by addressing conflicts early and often.
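The goals above are what a CI job automates on every merge: run each stage in order and block the change at the first failure. A simplified Python sketch of that gating logic, with placeholder checks standing in for real build and test commands:

```python
def run_pipeline(stages) -> bool:
    """Run CI stages in order; stop at the first failure, as CI gates do."""
    for name, check in stages:
        if not check():
            print(f"stage '{name}' failed; merge blocked")
            return False
    print("all stages passed; change can merge")
    return True

# Placeholder checks; a real pipeline would shell out to build and
# test commands and inspect their exit codes instead.
stages = [
    ("build", lambda: True),
    ("unit tests", lambda: 2 + 2 == 4),
    ("lint", lambda: True),
]
run_pipeline(stages)
```

The key property is fail-fast ordering: a broken build never reaches the test stage, and a failing test never reaches the merge, which is how conflicts and regressions get caught early and often.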

---
title: 'DevOps vs Agile Methodology: Key Differences & Applications'
description: 'Explore the contrasts between DevOps and Agile: Understand their principles, differences, and how to apply them effectively in projects.'
authorId: ekene
excludedBySlug: '/devops/vs-agile'
seo:
title: 'DevOps vs Agile Methodology: Key Differences & Applications'
description: 'Explore the contrasts between DevOps and Agile: Understand their principles, differences, and how to apply them effectively in projects.'
ogImageUrl: 'https://assets.roadmap.sh/guest/devops-vs-agile-methodology-tlxj8.jpg'
relatedTitle: 'Other Guides'
relatedGuidesId: 'devops'
isNew: true
type: 'textual'
date: '2025-01-17'
sitemap:
priority: 0.7
changefreq: 'weekly'
tags:
- 'guide'
- 'textual-guide'
- 'guide-sitemap'
---
![DevOps vs Agile Methodology](https://assets.roadmap.sh/guest/devops-vs-agile-methodology-tlxj8.jpg)
Agile and DevOps are modern approaches organizations use to tackle some of the most complex challenges in software engineering. Whether used to complement each other or as separate methodologies, they can be tricky to define, and the line between the two often appears blurred.
Both [DevOps](https://roadmap.sh/devops) and Agile are designed to help organizations build robust software. But when should you choose one over the other? Which approach works best for your project? Can you combine them effectively to maximize their benefits?
Having worked as an engineer on projects of all sizes, I've experienced firsthand how Agile and DevOps shape workflows and outcomes. In this guide, I'll discuss their core principles, key differences, and practical use cases to help you decide which approach fits your needs.
## DevOps vs. Agile: Core values and goals
**Agile** is a project management and software development approach that breaks down the project into several dynamic phases, **known as sprints**, rather than completing an entire project at once. This approach enables teams to adapt to changes quickly, continuously improve the delivery pipeline, and stay focused on meeting customer needs.
It was born from the [Agile Manifesto](https://agilemanifesto.org/) in 2001 as a response to the challenges of traditional project management models like the Waterfall method, which often led to delays, rigidity, and a disconnect between customers' demands and what developers had built.
![Agile timeline](https://assets.roadmap.sh/guest/agile-timeline-174xo.png)
The core values of Agile methodologies are:
1. **Individuals and interactions over processes and tools**: Prioritize human communication and collaboration across cross-functional teams rather than relying on rigid processes.
2. **Responding to change over following a plan**: Embrace changing requirements at every stage of the development process. This flexibility allows the team to quickly adjust workflows and strategies without derailing the entire project.
3. **Customer collaboration over contract negotiation**: Incorporate continuous customer feedback and use it to shape project deliverables and align the outcome with customer needs.
4. **Working software over comprehensive documentation**: Ensure that the software Agile teams develop works effectively; supporting tasks like documentation should not take center stage in the development process.
When you use Agile in your application development or other types of software development process, it offers numerous benefits, such as:
- Responding quickly to market changes and customer feedback.
- Improving collaboration by fostering open communication, frequent iterations, and shared ownership.
- Increasing customer satisfaction throughout the development process.
- Enhancing quality through frequent reviews and continuous testing.
- Empowering teams to innovate and solve problems creatively.
**DevOps,** on the other hand, is a set of tools, practices, and a cultural philosophy that bridges the gap between development (Dev) and operations (Ops) to enhance the delivery pipelines. It emphasizes automation, continuous integration/continuous delivery (CI/CD), and monitoring to ensure rapid and reliable software deployment.
![DevOps timeline](https://assets.roadmap.sh/guest/devops-timeline-f5wbv.png)
DevOps evolved as an extension of Agile to solve the bottleneck around operations, particularly in deployment, releases, and post-production maintenance. Its core values focus on:
1. **Collaboration and communication**: Foster a culture where developers, IT operations teams, QA teams, and other stakeholders actively collaborate and communicate throughout the development process.
2. **Automation**: Increase efficiency and minimize errors by automating repetitive tasks like testing, deployment, and infrastructure provisioning.
3. **Continuous Integration and Continuous Delivery (CI/CD)**: Implement automated pipelines to test, integrate, and deploy code quickly and reliably.
4. **Metrics and monitoring**: Use real-time monitoring and analytics to identify issues, optimize performance, and assess system health.
5. **Customer-centric focus**: Center development and operations processes around delivering value to customers with high-quality releases that meet their needs.
6. **Continuous improvement**: Treat DevOps not as a one-time adoption but as an ongoing process that promotes feedback loops and learns from both success and failure.
7. **Infrastructure as Code (IaC)**: Treat infrastructure provisioning and maintenance as code to enable version control, scalability, and reproducibility.
When you adopt DevOps in your development process, it offers numerous benefits, such as:
- Accelerating software releases with streamlined processes and automation.
- Reducing operational costs through efficient workflows and fewer bottlenecks.
- Improving software quality with automated testing to reduce bugs and enhance reliability.
- Resolving issues faster using continuous monitoring and real-time system insights.
- Enhancing security by integrating security practices into the development lifecycle (DevSecOps).
- Gaining a competitive advantage through faster innovation and the ability to adapt quickly to market changes.
Both DevOps and Agile offer numerous benefits that help you and your team build robust and scalable applications. But when should you choose one over the other? And what types of projects are best suited for each approach?
Let's dive into their usage and application next.
## Identifying when to use Agile vs. DevOps
The table below summarizes when to use Agile and DevOps:
| **Aspect** | **When to Use Agile** | **When to Use DevOps** |
| ------------------------ | ------------------------------------------------------------------------- | ---------------------------------------------------------------------------- |
| **Focus** | Building software step by step, improving as you go. | Combining development and operations for faster and smoother delivery. |
| **Team Structure** | Small teams working closely together, focusing on quick updates. | Developers and operations teams working as one from start to finish. |
| **Goal** | Deliver small, working parts of the project quickly and get feedback. | Deliver updates quickly and keep systems running smoothly. |
| **When Changes Happen** | Use Agile when project needs are likely to change often. | Use DevOps to handle changes quickly without breaking the system. |
| **Project Size** | Good for small to medium projects where teamwork and flexibility are key. | Good for large or complex projects where automation and speed are important. |
| **Release Timing** | Use Agile when you want planned updates (e.g., every two weeks). | Use DevOps when updates need to be released continuously. |
| **Tools and Automation** | Relies on planning tools like boards or trackers, with some automation. | Uses lots of automation tools to test, deploy, and monitor systems. |
| **Customer Involvement** | Use Agile when you need frequent feedback from customers. | Use DevOps when customers expect reliable and fast updates. |
| **Example Projects** | Developing a new app or adding new features to a product. | Running large systems or releasing updates to software quickly and often. |
## Focus
If your workflow is to develop software in small, manageable parts, Agile will be ideal for you. For example, if you're developing a new application, you can release the core features first, get feedback, and add more functionality over time.
DevOps, on the other hand, is perfect for delivering software quickly and maintaining its stability. For example, if you're managing a large-scale ticketing platform, DevOps ensures updates without downtime.
## Team structure
Agile works well with small teams of developers, designers, and testers where everyone can collaborate closely. For instance, if you're building a product for a startup, Agile methodology guarantees everyone is aligned.
In contrast, DevOps facilitates collaboration between development and operations teams to manage the entire process, from writing code to running it in production.
## Project size and change frequency
Agile is well-suited for small to medium projects, such as launching a minimum viable product (MVP) or adding new features to an existing platform while accounting for customers' needs along the way.
DevOps, in contrast, is good for large or complex projects that involve infrastructure, automation, and scalability.
## Project delivery
Agile uses sprints (time-boxed iterations) for planned updates, which makes it ideal for projects that deliver new features frequently (e.g., every two weeks). Agile helps you stay organized and on schedule.
DevOps doesn't work in fixed intervals like Agile; it allows you to release updates as soon as they're ready.
## Customer involvement
Agile works best when you need regular feedback from the customer at every stage of the development. In contrast, DevOps is better suited for scenarios where customers prioritize fast responses and high uptime.
## Usage of tools and automation
Agile keeps things simple, relying on tools like Jira and Trello that focus on planning and collaboration. DevOps, by contrast, leans heavily on automation, with tools like Jenkins and Docker that automate testing, deployment, and monitoring.
| **Agile** | **DevOps** |
| ------------------------------- | -------------------------- |
| Jira (project management) | Jenkins (CI/CD automation) |
| Trello (task tracking) | Docker (containerization) |
| Confluence (team collaboration) | Kubernetes (orchestration) |
As a rule of thumb, use the summary table below to decide when starting a new project or expanding existing ones.
| **Factor** | **Agile** | **DevOps** |
| ---------------------------------------------- | --------- | ---------- |
| **Small, cross-functional teams?** | ✅ Yes | ❌ No |
| **Large teams requiring IT operations?** | ❌ No | ✅ Yes |
| **High variability in project complexity?** | ✅ Yes | ❌ No |
| **Multi-stage delivery pipelines?** | ❌ No | ✅ Yes |
| **Rare updates (deployment frequency)?** | ✅ Yes | ❌ No |
| **Frequent releases (deployment frequency)?** | ❌ No | ✅ Yes |
| **Need for automation (CI/CD or monitoring)?** | ❌ No | ✅ Yes |
## Hybrid Situations: Combining Agile and DevOps
In complex projects, blending Agile's adaptability with DevOps' automation and deployment efficiency can produce the best results for you and your team. Below are some use cases where adopting both methodologies proves most beneficial:
- Large enterprise applications
- Mobile application development
- Microservices architecture
- Artificial Intelligence (AI) and Machine Learning (ML) projects
**Large enterprise applications**
If you're building a large enterprise application, you can break down feature development into smaller tasks and prioritize them in sprints using Agile. At the same time, DevOps helps maintain smooth delivery without downtime by automating testing, integration, and deployment. For instance, if you're managing a video streaming service, you can use Agile to plan features like personalized recommendations and DevOps to deploy them continuously to users.
**Mobile application development**
Mobile app development and maintenance involve fixing bugs, adding new features, and ensuring compatibility with new devices. Agile software development methodology can streamline feature iterations, while DevOps facilitates rapid updates across app stores. For instance, if you're building a fintech app that demands rapid feature development alongside robust security and reliability, Agile can help you build and iterate features efficiently. Meanwhile, DevOps can automate compliance checks, testing, and secure deployments to maintain quality and trust.
**Microservices architecture**
A microservices architecture breaks applications down into smaller, independent services that can be developed and deployed separately. This approach aligns closely with Agile, since individual teams can own individual services, while DevOps facilitates seamless integration and delivery across those services.
**AI and ML projects**
Training, testing, and deploying AI models is an iterative process essential for keeping the system up to date. Agile practices can help you manage the iterative development of models and features, while DevOps can automate the deployment pipelines for updates and ensure effective monitoring in production.
The faster deployment cycles, higher customer satisfaction, and stable releases achieved through the combination of Agile and DevOps stem from the shared values between these software development methodologies. Let's explore these similarities further.
## Similarities between Agile and DevOps
Agile and DevOps are distinct methodologies, but they share some similarities in their goals, approaches, and principles for software development and delivery. Below are some key similarities between Agile and DevOps:
![Agile and DevOps Similarities](https://assets.roadmap.sh/guest/similarities-between-agile-and-devops-9c79k.png)
- Both emphasize collaboration between developers, testers, and other stakeholders to break down silos and foster teamwork.
- They prioritize delivering value to customers.
- Both advocate for working in smaller chunks rather than completing the project in one big cycle.
- They align in their focus on shortening the development cycles and reducing the time to market.
- Both promote continuous learning and process optimization.
- Both encourage the use of automation tools to enhance processes and reduce manual tasks.
- Both Agile and DevOps cultures require a shift towards openness and shared responsibility.
While Agile and DevOps share common similarities, they also differ in focuses, principles, and practices. Let's explore these differences next.
## Differences between Agile and DevOps
Below are some key differences between Agile and DevOps:
- Agile primarily focuses on the development phase with an emphasis on iterative development and continuous feedback, while DevOps focuses on the entire software lifecycle by bridging the gap between development and operations.
- Agile's core principle is customer-centricity, welcoming changing requirements even late in the development process, while DevOps' core principle is to automate repetitive tasks, strive for efficiency in the delivery pipeline, and maintain reliable systems.
- Agile measures success through development speed, software quality, and customer satisfaction. DevOps, on the other hand, uses metrics like deployment frequency, mean time to recovery (MTTR), lead time for changes, and system reliability.
- Agile promotes a culture of collaboration and adaptability within the development team, while DevOps promotes a culture of shared responsibility and accountability across the development and operations teams.
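Of the DevOps metrics mentioned, mean time to recovery (MTTR) is the most mechanical to compute: it is the average of resolved-minus-detected across incidents. A small Python sketch with invented incident timestamps:

```python
from datetime import datetime, timedelta

# Hypothetical incident log: (detected, resolved) timestamp pairs.
incidents = [
    (datetime(2025, 1, 3, 10, 0), datetime(2025, 1, 3, 10, 45)),  # 45 min
    (datetime(2025, 1, 9, 14, 0), datetime(2025, 1, 9, 16, 15)),  # 135 min
]

def mttr(incidents) -> timedelta:
    """Mean time to recovery: average downtime per incident."""
    total = sum((resolved - detected for detected, resolved in incidents),
                timedelta())
    return total / len(incidents)

print(mttr(incidents))  # (45 + 135) / 2 minutes -> 1:30:00
</antml>```

Tracking this number over time, alongside deployment frequency and lead time, is how DevOps teams quantify whether their reliability practices are actually improving.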
## Next steps
Agile and DevOps differ in their approaches, addressing distinct aspects of software delivery. Agile is best suited for small to medium projects that change frequently and require a high degree of adaptability. In contrast, DevOps excels in medium to large projects where efficiency and reliability are paramount. Ultimately, the approach you choose depends on factors such as project complexity, release frequency, and team size.
If you're considering adopting DevOps at any stage of your development process, explore our comprehensive [DevOps roadmap](https://roadmap.sh/devops) for actionable steps and valuable resources to get started.

---
title: 'DevOps vs DevSecOps: Key Differences and Best Fit'
description: 'DevOps vs DevSecOps: Learn the key differences, benefits, and how to choose the best approach for your needs and applications.'
authorId: ekene
excludedBySlug: '/devops/vs-devsecops'
seo:
title: 'DevOps vs DevSecOps: Key Differences and Best Fit'
description: 'DevOps vs DevSecOps: Learn the key differences, benefits, and how to choose the best approach for your needs and applications.'
ogImageUrl: 'https://assets.roadmap.sh/guest/devops-vs-devsecops-3drth.jpg'
relatedTitle: 'Other Guides'
relatedGuidesId: 'devops'
isNew: true
type: 'textual'
date: '2025-01-17'
sitemap:
priority: 0.7
changefreq: 'weekly'
tags:
- 'guide'
- 'textual-guide'
- 'guide-sitemap'
---
![DevOps vs DevSecOps comparison guide](https://assets.roadmap.sh/guest/devops-vs-devsecops-3drth.jpg)
Over the years, the demand for high-quality software and resilient systems has grown significantly. Businesses are under immense pressure to deliver software faster than ever. However, rushing development often comes with trade-offs, such as increased security risks that can compromise entire systems.
Traditional development practices struggled to keep up with the need for both speed and security, creating a critical challenge for organizations. To address the challenge of balancing rapid software delivery with the need for robust security and quality, two models were introduced: DevOps and DevSecOps.
[DevOps](https://roadmap.sh/devops/devops-engineer) focuses on streamlining the development and operations lifecycle to deliver software quickly. DevSecOps integrates security practices into the DevOps pipeline, prioritizing security from the start and throughout the entire development process.
In this blog, you will learn about the main purpose and role of DevOps and DevSecOps. You will also explore a comparison between the two, helping you determine which approach is right for your needs. Understanding the key differences and benefits is essential to choosing the right model, so keep reading!
Below is a quick comparison table of DevOps vs. DevSecOps for easier reference:
![DevOps vs DevSecOps](https://assets.roadmap.sh/guest/comparison-table-of-devops-vs-devsecops-wcai5.png)
## DevOps vs DevSecOps: How are they different?
Choosing between DevOps and DevSecOps can determine whether your software is fast—or secure from the start.
While both approaches aim to enhance collaboration and efficiency in software development, DevSecOps incorporates security practices early in the development cycle, unlike DevOps, which often addresses security issues at a later stage.
DevOps is primarily focused on improving collaboration between design, development, and operations teams to speed up software delivery. The core idea is to remove bottlenecks and inefficiencies in the development pipeline. DevOps engineers are skilled in coding, automation, and system administration, and they focus on delivering high-quality software with minimal errors, often through continuous integration and continuous delivery (CI/CD).
![Understanding DevOps vs DevSecOps](https://assets.roadmap.sh/guest/understanding-devops-vs-devsecops-v9tkn.png)
On the other hand, DevSecOps brings security into the equation by integrating cybersecurity practices throughout the development process. This approach arose to address increasing cyber threats by embedding security checks at every phase of the software development lifecycle (SDLC). While DevOps ensures quick software delivery, DevSecOps emphasizes secure and compliant software delivery by shifting security considerations to the left of the development timeline, ensuring that vulnerabilities are detected early.
## Security in DevOps vs. DevSecOps: A Closer Look at Processes and Tools
One of the key differences between DevOps and DevSecOps lies in how they handle security.
In DevOps workflows, security testing typically occurs near the end of the development cycle, during quality assurance or post-deployment. Security measures, such as patches or vulnerability scanning, are often applied as part of the release process. This can result in delayed launches or costly remediation efforts if critical issues are discovered late in the pipeline.
![Role of security in DevOps and DevSecOps](https://assets.roadmap.sh/guest/role-of-security-rj7j1.png)
DevSecOps, on the other hand, focuses on strengthening deployment security and maintaining data protection and compliance by tracking issues as they arise. This approach uses both shift-left and shift-right security testing strategies. Shift-left testing involves identifying security vulnerabilities early in the development process, even before code is merged.
Tools like static application security testing (SAST), dynamic application security testing (DAST), and dependency checkers are embedded into CI/CD pipelines to catch publicly disclosed vulnerabilities. Additionally, automated scanners and code analyzers ensure that potential risks are flagged before reaching production.
For example, in a DevOps environment, a team may identify security vulnerabilities only after a routine code audit or during the final phase of testing. However, in a DevSecOps setup, automated security checks would be integrated into the CI/CD pipeline, flagging issues in real time before code is deployed, saving time and mitigating risks.
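A dependency checker of the kind embedded in these pipelines boils down to comparing pinned dependency versions against a list of known advisories. The sketch below is deliberately simplified and the advisory data is invented; real checkers query vulnerability databases and fail the pipeline stage when the findings list is non-empty.

```python
# Invented advisory list: package -> versions with known vulnerabilities.
KNOWN_VULNERABLE = {
    "struts": {"2.3.31"},
    "log4j": {"2.14.1"},
}

def audit(dependencies: dict) -> list:
    """Return findings for any pinned dependency on a vulnerable version."""
    return [
        f"{pkg}=={ver} has a known vulnerability"
        for pkg, ver in dependencies.items()
        if ver in KNOWN_VULNERABLE.get(pkg, set())
    ]

findings = audit({"struts": "2.3.31", "requests": "2.32.0"})
# A non-empty findings list fails the CI stage before anything deploys.
print(findings)
```

Because the check runs on every commit rather than at release time, a vulnerable pin is caught the moment it is introduced instead of in a post-deployment audit.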
### Shift-Left and Shift-Right Strategies
[Shift-left security](https://roadmap.sh/devops/shift-left-testing) is a key component of DevSecOps. With early security testing (shift-left security), DevSecOps helps identify risks and prevent product compromise. Addressing errors during the production phase is far more cost-effective than fixing them after deployment. Additionally, continuous security testing reinforces compliance with industry standards.
A real-world example of the consequences of delayed security integration is the [2017 Equifax data breach](https://archive.epic.org/privacy/data-breach/equifax/). A known vulnerability in the Apache Struts framework was left unpatched, resulting in the exposure of sensitive customer data for over 147 million people. Had shift-left security practices been in place, experts could have flagged the outdated library during the early stages of development and prevented the breach.
Shift-right testing is equally important as it helps developers detect security threats and fix issues in real time. Delays in security threat detection can impact product integrity and customer trust.
For instance, imagine your organization is building a financial application in which security processes and tests are scheduled to run only during the final phase.
In that case, issues discovered at the final phase can delay the product launch and drive up costs. And if you ship the product anyway to avoid a launch delay, the unresolved vulnerabilities can damage your reputation and erode customer trust.
DevSecOps recognizes the impact of such issues and therefore recommends combining shift-left and shift-right strategies, which reduce vulnerabilities and enable faster time to market while protecting your organization's reputation and customer trust.
## How do DevOps and DevSecOps affect business goals?
DevOps and DevSecOps affect several key business goals, such as time to market, customer satisfaction, operational efficiency, and risk management. Here's how.
1. **Time to market**
DevOps speeds up product delivery by automating workflows, removing bottlenecks and enabling faster iterations. DevSecOps puts the necessary checks in place without derailing development timelines so businesses can maintain a regular release cadence and meet market demand.
2. **Customer satisfaction**
DevOps delivers frequent updates and new features to keep up with customer demand and improve user experience. DevSecOps builds on this by delivering secure and reliable products, reducing the risk of issues that frustrate users. Speed and reliability together increase customer trust and loyalty.
3. **Operational efficiency**
DevOps makes tasks more efficient by removing duplication and eliminating manual intervention. DevSecOps adds to this by addressing risks early, avoiding rework and operational downtime. Together they reduce development costs and increase productivity.
4. **Risk management**
DevOps allows for faster iterations and deployments which can introduce risks if not managed properly. DevSecOps mitigates these risks by making security a core part of the development lifecycle. This proactive approach reduces the chance of breaches or compliance issues and protects the business's reputation and financials.
## Core processes in DevOps and DevSecOps
To accelerate the software development lifecycle, DevOps gives more attention to automation and collaboration. Monitoring in DevOps primarily focuses on performance, availability, and system uptime. Metrics like CPU utilization, application response times, and log aggregation form the foundation of DevOps monitoring strategies. Incident response, while essential, is reactive in nature—triggered only after an issue, such as a system crash or performance degradation, arises.
DevSecOps runs security tests at every stage, adopting a more proactive approach. Continuous monitoring in DevSecOps goes beyond traditional metrics to include threat detection, vulnerability scanning, and compliance checks, with a focus on reducing risk and cost. Teams use tools such as SIEM (Security Information and Event Management) systems and cloud-native security platforms to detect threats in real time. Incident response in DevSecOps involves automated playbooks and AI-driven analysis to address vulnerabilities, often before they can escalate.
DevSecOps also employs SAST and DAST strategies that help identify security vulnerabilities faster. Under SAST, professionals scan source code early to prevent vulnerabilities from entering production. Common SAST tools include [SonarQube](https://www.sonarsource.com/products/sonarqube/) and [Checkmarx](https://checkmarx.com/).
Under the DAST strategy, by contrast, professionals evaluate applications in their running state to identify vulnerabilities. Common DAST tools include [OWASP ZAP](https://www.zaproxy.org/) and [Burp Suite](https://portswigger.net/burp), which help identify injection flaws and security misconfigurations.
Interactive Application Security Testing (IAST), another practice that combines SAST and DAST, operates within the application runtime environment to provide detailed insights into vulnerabilities during testing and QA phases.
These advanced testing methodologies—SAST, DAST, and IAST—not only enhance security within specific stages of development but also lay the groundwork for broader, innovative practices in continuous security monitoring. These emerging practices are redefining traditional monitoring and response strategies:
### Real-time incident response
DevSecOps relies on tools that apply AI and machine learning for real-time threat detection and mitigation.
Examples of such tools include Splunk, Datadog Security Monitoring, and CrowdStrike Falcon, which provide AI-driven threat detection and automated incident response.
### Behavioral analytics
Monitoring user and application behavior allows DevSecOps teams to detect anomalies, such as unusual data access or traffic patterns, that could indicate a breach.
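One simple form of behavioral analytics is statistical anomaly detection. The sketch below flags an observation that deviates sharply from a per-user baseline; the request-rate numbers and the z-score threshold are illustrative assumptions, not values from any particular tool:

```python
import statistics

def is_anomalous(baseline, observed, z_threshold=3.0):
    """Flag an observation that deviates sharply from the baseline."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    if stdev == 0:
        return observed != mean
    # z-score: how many standard deviations away from normal behavior
    return abs(observed - mean) / stdev > z_threshold

# Requests per minute for a typical user session (illustrative baseline).
baseline = [42, 38, 45, 40, 44, 39, 41, 43]
print(is_anomalous(baseline, 40))   # ordinary traffic
print(is_anomalous(baseline, 400))  # possible exfiltration or abuse
```

Production systems use far richer models, but the principle is the same: learn what "normal" looks like, then alert on deviations.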
### Continuous compliance
DevSecOps embeds security policies and regulatory standards into the development process for continuous compliance. Teams use automation tools such as Policy-as-Code frameworks and compliance scanners to enforce and validate standards like GDPR, HIPAA, and PCI DSS. This reduces compliance risks and makes auditing easier.
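A Policy-as-Code check can be sketched in a few lines: policies are declared as data and evaluated against a resource description. The rule names and resource fields below are hypothetical, not drawn from any specific framework:

```python
# Policy-as-Code sketch: rules are data, so they can be versioned,
# reviewed, and enforced automatically. Rule names are illustrative.
POLICIES = [
    ("encryption-at-rest", lambda r: r.get("encrypted") is True),
    ("no-public-access",   lambda r: r.get("public_access") is False),
    ("retention>=30d",     lambda r: r.get("retention_days", 0) >= 30),
]

def evaluate(resource):
    """Return the names of policies the resource violates."""
    return [name for name, rule in POLICIES if not rule(resource)]

bucket = {"encrypted": True, "public_access": True, "retention_days": 90}
print(evaluate(bucket))
```

Because the rules live in code, every change to them goes through the same review and audit trail as the application itself.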
## Moving from DevOps to DevSecOps
The shift from DevOps to DevSecOps is a critical evolution for organizations aiming to integrate security seamlessly into the development lifecycle. Below is a guide to facilitate this transition, highlighting actionable steps, tools, and strategies for success.
![Moving from DevOps to DevSecOps](https://assets.roadmap.sh/guest/moving-from-devops-to-devsecops-gimtv.png)
### Understand your goals
Before implementing changes or transitioning to DevSecOps, step back and clearly lay out your goals. What do you want to achieve by moving to the DevSecOps model? Are you looking to strengthen your security posture, or do you need faster software deployment? Being specific about your goals will help you make informed decisions and develop a plan that aligns with them.
### Assess your current workflow
Before transitioning to a new model, it is important to assess the existing workflow. Identify areas that require improvement and attention. For example, check for proper coordination and communication among development, operations, and security teams. Are there loopholes or faults in your current workflow? Is security feedback consistently integrated into development cycles? Host cross-team retrospectives or root cause analyses to uncover communication gaps and get a clearer picture of your situation.
### Choose the right automation tool
If workflow efficiency is an issue, the best solution is to implement automation tools. These tools can reduce manual tasks, speed up code reviews, run security tests, and enable quick deployments. They also free professionals to focus on other key areas, such as fixing errors or building new features within the application.
### Training your teammates
Adopting a new model or practice demands educating the team members early about the new process and security concerns. Inform and educate your team members about the importance of security systems and how integrating them can improve their overall performance. You can also run training sessions or seminars to cover more about security guidelines and standards. Help them understand rising security concerns, how to fill gaps, and how to integrate security throughout the software development cycle. This step will further prevent confusion and problems from escalating in the future.
Educating the team early prevents missteps and promotes ownership of the new processes. But despite preparation, organizations often face challenges when transitioning to DevSecOps. Let's look at some of the most common ones.
## DevSecOps transition challenges
Remember, transitioning to DevSecOps is not as easy as it sounds. There are various challenges, but these can be managed with the right approach.
![DevSecOps Best Practices](https://assets.roadmap.sh/guest/devsecops-best-practices-m6e21.png)
Here are a few things to avoid in the transition period:
### Wrong tool selection
There is a wide range of security applications on the market, so make sure to select one that is relevant to your codebase and meets your requirements. Otherwise, you may struggle to maintain it in the long term.
For guidance on selecting the right tools, refer to the **"Top tools and processes for a smooth transition"** section, where we highlight specific tools and best practices to facilitate a successful DevSecOps implementation.
### Non-inclusion of operations and security teams
Security tests are conducted at every phase of software production. Excluding your operations and security teams from the monitoring and tracking process limits the ability to identify and address faults and bugs effectively. Involving security experts from the start allows them to provide guidance on misconfigurations, tools, and best practices.
### Speed over quality
DevOps emphasizes quick software delivery, which can sometimes lead to insufficient attention to quality and security functionality. This may affect the user experience and your business reputation. Allocating more time and effort to ensuring quality and integrating security practices can help strike the right balance.
### Code monitoring issues
Since code constantly changes in software production, keeping an eye on it at all times can be challenging for some professionals. It is important to introduce new configurations, tools, and practices that can identify vulnerabilities in real time.
## Top tools and processes for a smooth transition
Next, let's look at how tools and processes can facilitate this transition while keeping agility and innovation intact:
### Prioritize security-first CI/CD configurations
CI/CD pipelines are the backbone of modern DevOps workflows. Embedding security into these pipelines ensures vulnerabilities are identified and mitigated at every stage, so incorporate Static Application Security Testing (SAST) and Dynamic Application Security Testing (DAST) tools.
You can also use HashiCorp Vault or AWS Secrets Manager to manage sensitive information securely, and set policies that block builds with critical vulnerabilities using tools such as Jenkins, GitHub Actions, or GitLab CI/CD.
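Independent of any particular CI system, the gating logic itself is small: parse scanner findings and block the release when severity crosses a threshold. The findings format below is invented for illustration, not a real SAST/DAST report schema:

```python
SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def should_block(findings, threshold="high"):
    """Block the release if any finding meets or exceeds the threshold."""
    limit = SEVERITY_RANK[threshold]
    return any(SEVERITY_RANK[f["severity"]] >= limit for f in findings)

# Invented scanner output for illustration, not a real report schema.
findings = [
    {"id": "SQLI-001", "severity": "critical"},
    {"id": "XSS-014", "severity": "low"},
]

if should_block(findings):
    print("Security gate: blocking deployment")
else:
    print("Security gate: passed")
```

In a real pipeline, this decision would translate into a non-zero exit code that fails the build step.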
### Embrace infrastructure as code (IaC)
IaC automates infrastructure provisioning, but it can also introduce risks if not properly secured. For a smooth transition, integrate security into IaC processes: use Terrascan to detect vulnerabilities in Terraform, or implement [immutable infrastructure practices](https://devops.com/immutable-infrastructure-the-next-step-for-devops/) to reduce configuration drift. Also conduct regular audits of IaC templates for misconfigurations.
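The core of an IaC misconfiguration scan can be sketched over an already-parsed template (a plain dict here; real scanners such as Terrascan parse HCL or JSON). The resource names and checks are illustrative assumptions:

```python
# Sketch of an IaC misconfiguration check over a parsed template.
def scan_template(resources):
    """Return (resource, issue) pairs for common misconfigurations."""
    issues = []
    for name, cfg in resources.items():
        if cfg.get("acl") == "public-read":
            issues.append((name, "publicly readable"))
        if not cfg.get("versioning", False):
            issues.append((name, "versioning disabled"))
    return issues

# Hypothetical storage-bucket definitions, standing in for parsed HCL.
template = {
    "logs_bucket":   {"acl": "private", "versioning": True},
    "assets_bucket": {"acl": "public-read", "versioning": False},
}
print(scan_template(template))
```

Run in CI against every template change, checks like these catch misconfigurations before any infrastructure is provisioned.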
### Leverage advanced threat modeling
Threat modeling ensures that potential security risks are identified early, and new tools and frameworks make the process more effective. Invest in AI-powered tools that automatically suggest mitigations for identified risks.
## Will DevSecOps replace DevOps?
No, DevSecOps will not replace DevOps; instead, it enhances it. Rather than being a replacement, DevSecOps is an augmentation of DevOps, bringing security practices into the development and operations workflow. While DevOps focuses on speed, collaboration, and efficiency, DevSecOps makes sure that security becomes an inherent part of these processes. For example, integrating security tools like Snyk or SonarQube into CI/CD pipelines helps organizations identify vulnerabilities early in the development lifecycle.
The two are complementary rather than mutually exclusive. DevSecOps acts as a bridge, ensuring that security doesn't become a bottleneck while maintaining the agility of DevOps. This natural evolution addresses the growing need for secure software development without compromising agility. Let's further learn how DevSecOps will evolve in the future.
## Future of DevSecOps
As cybersecurity threats grow in sophistication and compliance regulations tighten, DevSecOps is poised to become the cornerstone of secure software development. The recent [CrowdStrike Global Threat Report](https://www.crowdstrike.com/en-us/global-threat-report/) notes that some attacks succeed in just minutes, and that CrowdStrike tracked over 230 adversaries leveraging the global adoption of cloud technologies in their attacks.
Tackling these challenges demands strategic teamwork and technical expertise. Here's how DevSecOps is expected to evolve and why it is the future of secure DevOps:
### Proactive threat mitigation
DevSecOps is transitioning from reactive to proactive security. Predictive threat analysis, enabled by AI and machine learning, will play a crucial role in identifying security concerns and vulnerabilities before exploitation. For example, tools like CrowdStrike will become essential for analyzing attack patterns.
### Integration with governance and compliance
Stricter regulations such as GDPR, HIPAA, and CCPA are driving a compliance-first culture. DevSecOps will increasingly integrate automated compliance checks into CI/CD pipelines, facilitating adherence to global standards without manual intervention.
### Rise of zero-trust architectures
The adoption of zero-trust principles will reshape security frameworks. DevSecOps will integrate zero-trust policies into development environments, guaranteeing continuous authentication and access verification. This approach will strengthen security for microservices and API-driven architectures.
### Cloud-native and container security
With the surge in cloud-native applications, securing containers and serverless environments will be a top priority. Several tools are available that will enable seamless security integration into cloud workloads, addressing misconfigurations and runtime vulnerabilities.
In terms of demand and salary, DevSecOps roles tend to command higher pay due to the specialized skill set and smaller talent pool. Demand will keep growing for DevSecOps engineers, security automation specialists, and compliance analysts who can integrate security throughout the SDLC.
## Conclusion
Now that you understand the differences between DevOps and DevSecOps, the choice comes down to your organization's specific goals and priorities.
If speed and efficiency are your primary focus, DevOps is a great fit. However, if security is paramount, DevSecOps is the better choice. By embedding security into every stage of the development lifecycle, DevSecOps helps mitigate vulnerabilities while ensuring compliance and quality.
Both methodologies hold significant value, but in an era of increasing cybersecurity threats, DevSecOps is becoming essential for organizations that prioritize secure innovation.
To navigate these approaches effectively and align them with your long-term goals—such as scalability, compliance, and reputation—explore our comprehensive [DevOps roadmap](https://roadmap.sh/devops). It offers actionable insights to help you build a strategy that drives efficiency, security, and success.

@ -32,7 +32,7 @@ In this project, instead of relying on our own weather data, we will build a wea
As for the actual weather API to use, you can use your favorite one, as a suggestion, here is a link to [Visual Crossing’s API](https://www.visualcrossing.com/weather-api), it’s completely FREE and easy to use.
Regarding the in-memory cache, a pretty common recommendation is to use [Redis](https://redis.io/), you can read more about it [here](https://redis.io/docs/latest/develop/clients/client-side-caching/), and as a recommendation, you could use the city code entered by the user as the key, and save there the result from calling the API.
At the same time, when you “set” the value in the cache, you can also give it an expiration time in seconds (using the `EX` flag on the `SET` command). That way the cache (the keys) will automatically clean itself when the data is old enough (for example, giving it a 12-hours expiration time).
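The expiration behavior described above can be sketched without Redis itself, using a dictionary and timestamps to stand in for `SET key value EX seconds`; the city code and weather payload below are placeholders:

```python
import time

class ExpiringCache:
    """Toy stand-in for Redis SET ... EX: values expire after a TTL."""
    def __init__(self):
        self._store = {}

    def set(self, key, value, ex):
        # ex: time-to-live in seconds, like Redis's EX flag
        self._store[key] = (value, time.monotonic() + ex)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict stale entries
            return None
        return value

cache = ExpiringCache()
cache.set("LON", {"temp_c": 11.0}, ex=12 * 60 * 60)  # 12-hour TTL
print(cache.get("LON"))
```

With Redis itself, the equivalent would be a single `SET LON <json> EX 43200`, and eviction happens server-side.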


@ -1,4 +1,4 @@
# Bias and Fairness
Bias and fairness in AI refer to the challenges of ensuring that machine learning models do not produce discriminatory or skewed outcomes. Bias can arise from imbalanced training data, flawed assumptions, or biased algorithms, leading to unfair treatment of certain groups based on race, gender, or other factors. Fairness aims to address these issues by developing techniques to detect, mitigate, and prevent biases in AI systems. Ensuring fairness involves improving data diversity, applying fairness constraints during model training, and continuously monitoring models in production to avoid unintended consequences, promoting ethical and equitable AI use.


@ -1,6 +1,8 @@
# DML (Data Manipulation Language)
The SQL commands that manipulate data in the database belong to DML, or Data Manipulation Language, and this includes most of the SQL statements. DCL is the component of the SQL statement that controls access to data and to the database. Basically, DCL statements are grouped with DML statements.
Visit the following resources to learn more:
- [@article@DML: Data Manipulation Language](https://satoricyber.com/glossary/dml-data-manipulation-language)
- [@article@Difference Between DDL and DML](https://appmaster.io/blog/difference-between-ddl-and-dml)

@ -1,5 +1,7 @@
# How Computers Calculate?
Computers calculate using the binary system, where all data is represented as 0s and 1s. These binary states correspond to the ON/OFF positions of transistors, which are the building blocks of logic gates (AND, OR, NOT). Numbers, characters, and instructions are broken into binary sequences (bits), and grouped into bytes (8 bits). Arithmetic operations like addition are performed through logic gates, which combine binary values. The CPU executes these calculations by following a fetch-decode-execute cycle. Complex calculations, such as handling decimals, use floating-point representation. Programs written in high-level languages are compiled into machine code for the CPU to execute.
Visit the following resources to learn more:
- [@video@How computers calculate - ALU](https://youtu.be/1I5ZMmrOfnA)

@ -1,6 +1,12 @@
# P = NP
The P = NP problem is one of the most famous problems in computer science. It asks whether a problem that can be solved in polynomial time on a non-deterministic machine (i.e., the problem is in NP) can also be solved in polynomial time on a deterministic machine (i.e., the problem is in P).
If you can find a polynomial-time solution to an NP-complete problem, then all problems in NP can be solved in polynomial time. This shows that P = NP.
If you can prove for any single NP-complete problem that it is only solvable in exponential time, then all NP-complete problems are only solvable in exponential time. This shows that P ≠ NP.
So far, we don't know whether P = NP or P ≠ NP.
Visit the following resources to learn more:

@ -2,5 +2,7 @@
Excel is a powerful tool utilized by data analysts worldwide to store, manipulate, and analyze data. It offers a vast array of features such as pivot tables, graphs and a powerful suite of formulas and functions to help sift through large sets of data. A data analyst uses Excel to perform a wide range of tasks, from simple data entry and cleaning, to more complex statistical analysis and predictive modeling. Proficiency in Excel is often a key requirement for a data analyst, as its versatility and ubiquity make it an indispensable tool in the field of data analysis.
Learn more from the following resources:
- [@article@W3Schools - Excel](https://www.w3schools.com/excel/index.php)
- [@course@Microsoft Excel Course](https://support.microsoft.com/en-us/office/excel-video-training-9bc05390-e94c-46af-a5b3-d7c22f6990bb)

@ -5,4 +5,4 @@ Application Programming Interfaces, better known as APIs, play a fundamental rol
Learn more from the following resources:
- [@article@What is an API?](https://aws.amazon.com/what-is/api/)
- [@article@A Beginner's Guide to APIs](https://www.postman.com/what-is-an-api/)

@ -1,8 +1,8 @@
# Average
When focusing on data analysis, understanding key statistical concepts is crucial. Amongst these, central tendency is a foundational element. Central Tendency refers to the measure that determines the center of a distribution. The average is a commonly used statistical tool by which data analysts discern trends and patterns. As one of the most recognized forms of central tendency, figuring out the "average" involves summing all values in a data set and dividing by the number of values. This provides analysts with a 'typical' value, around which the remaining data tends to cluster, facilitating better decision-making based on existing data.
Learn more from the following resources:
- [@article@How to Calculate the Average](https://support.microsoft.com/en-gb/office/calculate-the-average-of-a-group-of-numbers-e158ef61-421c-4839-8290-34d7b1e68283#:~:text=Average%20This%20is%20the%20arithmetic,by%206%2C%20which%20is%205.)
- [@article@Average Formula](https://www.cuemath.com/average-formula/)

@ -4,5 +4,5 @@ As a vital tool in the data analyst's arsenal, bar charts are essential for anal
Learn more from the following resources:
- [@article@A Complete Guide to Bar Charts](https://www.atlassian.com/data/charts/bar-chart-complete-guide)
- [@video@What is a Bar Chart?](https://www.youtube.com/watch?v=WTVdncVCvKo)

@ -1,3 +1,7 @@
# Big Data and Data Analyst
In the modern digitized world, Big Data refers to extremely large datasets that are challenging to manage and analyze using traditional data processing applications. These datasets often come from numerous different sources and are not only voluminous but also diverse in nature, including structured and unstructured data. The role of a data analyst in the context of big data is crucial. Data analysts are responsible for inspecting, cleaning, transforming, and modeling big data to discover useful information, conclude and support decision-making. They leverage their analytical skills and various big data tools and technologies to extract insights that can benefit the organization and drive strategic business initiatives.
Learn more from the following resources:
- [@article@Big Data Analytics](https://www.ibm.com/think/topics/big-data-analytics)

@ -4,5 +4,5 @@ The Cleanup of Data is a critical component of a Data Analyst's role. It involve
Learn more from the following resources:
- [@article@Top 10 Ways to Clean Your Data](https://support.microsoft.com/en-gb/office/top-ten-ways-to-clean-your-data-2844b620-677c-47a7-ac3e-c2e157d1db19)
- [@video@Master Data Cleaning Essentials on Excel in Just 10 Minutes](https://www.youtube.com/watch?v=jxq4-KSB_OA)

@ -1,3 +1,7 @@
# Data Cleaning
Data cleaning, which is often referred as data cleansing or data scrubbing, is one of the most important and initial steps in the data analysis process. As a data analyst, the bulk of your work often revolves around understanding, cleaning, and standardizing raw data before analysis. Data cleaning involves identifying, correcting or removing any errors or inconsistencies in datasets in order to improve their quality. The process is crucial because it directly determines the accuracy of the insights you generate - garbage in, garbage out. Even the most sophisticated models and visualizations would not be of much use if they're based on dirty data. Therefore, mastering data cleaning techniques is essential for any data analyst.
Learn more from the following resources:
- [@article@Data Cleaning](https://www.tableau.com/learn/articles/what-is-data-cleaning#:~:text=tools%20and%20software-,What%20is%20data%20cleaning%3F,to%20be%20duplicated%20or%20mislabeled.)

@ -1,3 +1,7 @@
# Data Collection
Data collection is a foundational process that entails gathering relevant data from various sources. This data can be quantitative or qualitative and may be sourced from databases, online platforms, customer feedback, among others. The gathered information is then cleaned, processed, and interpreted to extract meaningful insights. A data analyst performs this whole process carefully, as the quality of data is paramount to ensuring accurate analysis, which in turn informs business decisions and strategies. This highlights the importance of an excellent understanding, proper tools, and precise techniques when it comes to data collection in data analysis.
Learn more from the following resources:
- [@article@Data Collection](https://en.wikipedia.org/wiki/Data_collection)

@ -1,3 +1,9 @@
# Data Manipulation Libraries
Data manipulation libraries are essential tools in data science and analytics, enabling efficient handling, transformation, and analysis of large datasets. Python, a popular language for data science, offers several powerful libraries for this purpose. Pandas is a highly versatile library that provides data structures like DataFrames, which allow for easy manipulation and analysis of tabular data. NumPy, another fundamental library, offers support for large, multi-dimensional arrays and matrices, along with a collection of mathematical functions to operate on these arrays. Together, Pandas and NumPy form the backbone of data manipulation in Python, facilitating tasks such as data cleaning, merging, reshaping, and statistical analysis, thus streamlining the data preparation process for machine learning and other data-driven applications.
Learn more from the following resources:
- [@article@Pandas](https://pandas.pydata.org/)
- [@article@NumPy](https://numpy.org/)
- [@article@Top Python Libraries for Data Science](https://www.simplilearn.com/top-python-libraries-for-data-science-article)

@ -4,5 +4,5 @@ As a business enterprise expands, so does its data. For data analysts, the surge
Learn more from the following resources:
- [@roadmap@Visit Dedicated SQL Roadmap](https://roadmap.sh/sql)
- [@roadmap@Visit Dedicated PostgreSQL Roadmap](https://roadmap.sh/postgresql-dba)

@ -2,5 +2,7 @@
Data Transformation, also known as Data Wrangling, is an essential part of a Data Analyst's role. This process involves converting data from a raw format into another format to make it more appropriate and valuable for a variety of downstream purposes such as analytics. Data Analysts transform data to make it more suitable for analysis, ensure accuracy, and improve data quality. The right transformation techniques can give the data structure, multiply its value, and enhance the accuracy of the analytics performed by surfacing meaningful results.
Learn more from the following resources:
- [@article@What is data transformation?](https://www.qlik.com/us/data-management/data-transformation)
- [@feed@Explore top posts about Data Analysis](https://app.daily.dev/tags/data-analysis?ref=roadmapsh)

@ -1,3 +1,5 @@
# Data Visualisation Libraries
# Data Visualization Libraries
Data visualization libraries are crucial in data science for transforming complex datasets into clear and interpretable visual representations, facilitating better understanding and communication of data insights. In Python, several libraries are widely used for this purpose. Matplotlib is a foundational library that offers comprehensive tools for creating static, animated, and interactive plots. Seaborn, built on top of Matplotlib, provides a high-level interface for drawing attractive and informative statistical graphics with minimal code. Plotly is another powerful library that allows for the creation of interactive and dynamic visualizations, which can be easily embedded in web applications. Additionally, libraries like Bokeh and Altair offer capabilities for creating interactive plots and dashboards, enhancing exploratory data analysis and the presentation of data findings. Together, these libraries enable data scientists to effectively visualize trends, patterns, and outliers in their data, making the analysis more accessible and actionable.
Learn more from the following resources:

@ -1,3 +1,7 @@
# Data Visualization
Data Visualization is a fundamental part of a data analyst's responsibilities. It involves presenting data in a graphical or pictorial format that allows decision-makers to see analytics visually. This practice can help them comprehend difficult concepts or establish new patterns. With interactive visualization, data analysts can take the data analysis process to a whole new level: drill down into charts and graphs for more detail, and interactively change what data is presented or how it is processed. It thereby forms a crucial link in the chain of converting raw data into actionable insights, which is one of the primary roles of a Data Analyst.
Learn more from the following resources:
- [@article@What is Data Visualization?](https://www.ibm.com/think/topics/data-visualization)

@ -4,5 +4,5 @@ Behind every strong data analyst, there's not just a rich assortment of data, bu
Learn more from the following resources:
- [@official@PostgreSQL Roadmap](https://roadmap.sh/postgresql-dba)
- [@official@MongoDB Roadmap](https://roadmap.sh/mongodb)
- [@roadmap@Visit Dedicated SQL Roadmap](https://roadmap.sh/sql)
- [@roadmap@Visit Dedicated PostgreSQL Roadmap](https://roadmap.sh/postgresql-dba)

@ -2,7 +2,7 @@
The `DATEDIF` function is an incredibly valuable tool for a Data Analyst in Excel or Google Sheets, providing the ability to calculate the difference between two dates. This function takes three parameters: the start date, the end date, and the unit of difference required (years, months, days, etc.). In Data Analysis, particularly when dealing with time-series data or when you need to uncover trends over specific periods, the `DATEDIF` function is a necessary asset. Recognizing its functionality will enable a data analyst to manipulate and shape data efficiently.
`DATEDIF` is technically still supported, but won't show as an option in the function list. For additional information, see Excel's "Help" page.
Learn more from the following resources:
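In the spreadsheet itself the call looks like `=DATEDIF(A1, B1, "Y")` with units such as `"Y"`, `"M"`, and `"D"`. As a rough stdlib-Python sketch of what those units compute (a simplified approximation, not Excel's exact edge-case behavior):

```python
from datetime import date

def datedif(start: date, end: date, unit: str) -> int:
    """Approximate Excel's DATEDIF for the "Y", "M" and "D" units."""
    if unit == "D":
        return (end - start).days
    # Whole months elapsed between the two dates
    months = (end.year - start.year) * 12 + (end.month - start.month)
    if end.day < start.day:
        months -= 1  # the last partial month doesn't count
    if unit == "M":
        return months
    if unit == "Y":
        return months // 12
    raise ValueError(f"unsupported unit: {unit!r}")

print(datedif(date(2020, 1, 15), date(2023, 3, 10), "Y"))  # → 3
```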

@ -1,3 +1,7 @@
# Deep Learning and Data Analysis
Deep learning, a subset of machine learning, is increasingly becoming a critical tool for data analysts. Deep learning algorithms utilize multiple layers of neural networks to understand and interpret intricate structures in large datasets, a skill that is integral to the daily functions of a data analyst. With the ability to learn from unstructured or unlabeled data, deep learning opens a whole new range of possibilities for data analysts in terms of data processing, prediction, and categorization. It has applications in a variety of industries, from healthcare to finance to e-commerce and beyond. A deeper understanding of deep learning methodologies can augment a data analyst's capability to evaluate and interpret complex datasets and provide valuable insights for decision-making.
Learn more from the following resources:
- [@article@Deep Learning for Data Analysis](https://www.ibm.com/think/topics/deep-learning)

@ -4,5 +4,5 @@ Data cleaning plays a crucial role in the data analysis pipeline, where it recti
Learn more from the following resources:
- [@official@dplyr website](https://dplyr.tidyverse.org/)
- [@official@dplyr](https://dplyr.tidyverse.org/)
- [@video@Dplyr Essentials](https://www.youtube.com/watch?v=Gvhkp-Yw65U)

@ -4,5 +4,5 @@ Dplyr is a powerful and popular toolkit for data manipulation in R. As a data an
Learn more from the following resources:
- [@official@dplyr website](https://dplyr.tidyverse.org/)
- [@official@dplyr](https://dplyr.tidyverse.org/)
- [@video@Dplyr Essentials](https://www.youtube.com/watch?v=Gvhkp-Yw65U)

@ -1,8 +1,8 @@
# ggplot2
When it comes to data visualization in R programming, ggplot2 stands tall as one of the primary tools for data analysts. This data visualization library, which forms part of the tidyverse suite of packages, facilitates the creation of complex and sophisticated visual narratives. With its grammar of graphics philosophy, ggplot2 enables analysts to build graphs and charts layer by layer, thereby offering detailed control over graphical features and design. Its versatility in creating tailored and aesthetically pleasing graphics is a vital asset for any data analyst tackling exploratory data analysis, reporting, or dashboard building.
Learn more from the following resources:
- [@article@ggplot2 website](https://ggplot2.tidyverse.org/)
- [@official@ggplot2](https://ggplot2.tidyverse.org/)
- [@video@Make beautiful graphs in R](https://www.youtube.com/watch?v=qnw1xDnt_Ec)

@ -1,8 +1,8 @@
# Hadoop
Hadoop is a critical element in the realm of data processing frameworks, offering an effective solution for storing, managing, and analyzing massive amounts of data. Unraveling meaningful insights from a large deluge of data is a challenging pursuit faced by many data analysts. Regular data processing tools fail to handle large-scale data, paving the way for advanced frameworks like Hadoop. This open-source platform by Apache Software Foundation excels at storing and processing vast data across clusters of computers. Notably, Hadoop comprises two key modules - the Hadoop Distributed File System (HDFS) for storage and MapReduce for processing. Hadoop’s ability to handle both structured and unstructured data further broadens its capacity. For any data analyst, a thorough understanding of Hadoop can unlock powerful ways to manage data effectively and construct meaningful analytics.
Learn more from the following resources:
- [@official@Apache Hadoop Website](https://hadoop.apache.org/)
- [@official@Apache Hadoop](https://hadoop.apache.org/)
- [@article@What Is Hadoop?](https://www.databricks.com/glossary/hadoop)
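Hadoop itself is a JVM-based cluster system, but the MapReduce model its processing layer implements can be sketched in plain Python. This is a toy, single-machine illustration of the map, shuffle, and reduce phases, not Hadoop's actual API:

```python
from collections import defaultdict

def map_phase(document: str):
    # Map: emit (word, 1) for every word, as each HDFS split would in parallel
    for word in document.lower().split():
        yield word, 1

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key and sum the counts per word
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big clusters", "data everywhere"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
print(reduce_phase(pairs))  # word-count result
```

The value of the real framework is that the map and reduce phases run distributed across a cluster, with HDFS handling storage and fault tolerance.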

@ -4,5 +4,5 @@ Heatmaps are a crucial component of data visualization that Data Analysts regula
Learn more from the following resources:
- [@article@A complete guide to heatmaps](https://www.hotjar.com/heatmaps/)
- [@article@What is a heatmap?](https://www.atlassian.com/data/charts/heatmap-complete-guide)
- [@article@A Complete Guide to Heatmaps](https://www.hotjar.com/heatmaps/)
- [@article@What is a Heatmap?](https://www.atlassian.com/data/charts/heatmap-complete-guide)

@ -4,5 +4,5 @@ Image Recognition has become a significant domain because of its diverse applica
Learn more from the following resources:
- [@article@What is image recognition?](https://www.techtarget.com/searchenterpriseai/definition/image-recognition)
- [@article@What is Image Recognition?](https://www.techtarget.com/searchenterpriseai/definition/image-recognition)
- [@article@Image Recognition: Definition, Algorithms & Uses](https://www.v7labs.com/blog/image-recognition-guide)

@ -1,8 +1,8 @@
# Matplotlib
For a Data Analyst, understanding data and being able to represent it in a visually insightful form is a crucial part of effective decision-making in any organization. Matplotlib, a plotting library for the Python programming language, is an extremely useful tool for this purpose. It presents a versatile framework for generating line plots, scatter plots, histograms, bar charts, and much more in a very straightforward manner. This library also allows for comprehensive customizations, offering a high level of control over the look and feel of the graphics it produces, which ultimately enhances the quality of data interpretation and communication.
Learn more from the following resources:
- [@video@Learn Matplotlib in 6 minutes](https://www.youtube.com/watch?v=nzKy9GY12yo)
- [@article@Matplotlib Website](https://matplotlib.org/)
- [@official@Matplotlib](https://matplotlib.org/)
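A minimal example of the kind of chart described above, assuming Matplotlib is installed; it saves to a file so it works without a display, and the sales figures are made up:

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Hypothetical monthly sales figures
months = ["Jan", "Feb", "Mar", "Apr"]
sales = [120, 135, 150, 145]

fig, ax = plt.subplots()
ax.bar(months, sales, color="steelblue")
ax.set_xlabel("Month")
ax.set_ylabel("Units sold")
ax.set_title("Monthly sales")
fig.savefig("monthly_sales.png")
```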

@ -4,5 +4,5 @@ Matplotlib is a paramount data visualization library used extensively by data an
Learn more from the following resources:
- [@video@Learn Matplotlib in 6 minutes](https://www.youtube.com/watch?v=nzKy9GY12yo)
- [@article@Matplotlib Website](https://matplotlib.org/)
- [@official@Matplotlib](https://matplotlib.org/)

@ -2,7 +2,7 @@
The concept of central tendency is fundamental in statistics and has numerous applications in data analysis. From a data analyst's perspective, the central tendencies like mean, median, and mode can be highly informative about the nature of data. Among these, the "Mode" is often underappreciated, yet it plays an essential role in interpreting datasets.
The mode, in essence, represents the most frequently occurring value in a dataset. While it may appear simplistic, the mode's ability to identify the most common value can be instrumental in a wide range of scenarios, like market research, customer behavior analysis, or trend identification. For instance, a data analyst can use the mode to determine the most popular product in a sales dataset or identify the most commonly reported bug in a software bug log.
Beyond these, utilizing the mode along with the other measures of central tendency (mean and median) can provide a more rounded view of your data. This approach reflects the diversity that's often required in data analytics strategies to account for different data distributions and outliers. The mode, therefore, forms an integral part of the data analyst's toolkit for statistical data interpretation.
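Computing the mode is built into Python's standard library, and `collections.Counter` exposes the underlying frequency count; the product names here are made up:

```python
from collections import Counter
from statistics import mode

products_sold = ["laptop", "mouse", "laptop", "keyboard", "laptop", "mouse"]

print(Counter(products_sold))  # frequency of each value
print(mode(products_sold))     # most frequent value → 'laptop'
```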

@ -4,5 +4,5 @@ As a data analyst, it's crucial to understand various model evaluation technique
Learn more from the following resources:
- [@article@What is model evaluation](https://domino.ai/data-science-dictionary/model-evaluation)
- [@article@Model evaluation metrics](https://www.markovml.com/blog/model-evaluation-metrics)
- [@article@What is Model Evaluation](https://domino.ai/data-science-dictionary/model-evaluation)
- [@article@Model Evaluation Metrics](https://www.markovml.com/blog/model-evaluation-metrics)
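The most common evaluation metrics reduce to simple counting over a confusion matrix. A sketch with a hypothetical set of true and predicted binary labels:

```python
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Count true positives, false positives, and false negatives
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)  # of predicted positives, how many were right
recall = tp / (tp + fn)     # of actual positives, how many were found

print(f"accuracy={accuracy}, precision={precision}, recall={recall}")
```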

@ -4,5 +4,5 @@ Neural Networks play a pivotal role in the landscape of deep learning, offering
Learn more from the following resources:
- [@article@What is a neural network?](https://aws.amazon.com/what-is/neural-network/)
- [@article@What is a Neural Network?](https://aws.amazon.com/what-is/neural-network/)
- [@article@Explained: Neural networks](https://news.mit.edu/2017/explained-neural-networks-deep-learning-0414)
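At its smallest, a network's layers come down to weighted sums passed through a nonlinearity. A single artificial neuron in plain Python, with weights chosen arbitrarily for illustration:

```python
import math

def sigmoid(x: float) -> float:
    # Squashes any real number into the open interval (0, 1)
    return 1 / (1 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, then a nonlinear activation
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

print(neuron([1.0, 0.5], [0.4, -0.2], 0.1))
```

A deep network stacks many such units in layers and learns the weights from data instead of fixing them by hand.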

@ -4,5 +4,5 @@ Pandas is a widely acknowledged and highly useful data manipulation library in t
Learn more from the following resources:
- [@official@Pandas Website](https://pandas.pydata.org/)
- [@official@Pandas](https://pandas.pydata.org/)
- [@video@NumPy vs Pandas](https://www.youtube.com/watch?v=KHoEbRH46Zk)

@ -4,5 +4,5 @@ In the realms of data analysis, data cleaning is a crucial preliminary process,
Learn more from the following resources:
- [@official@Pandas Website](https://pandas.pydata.org/)
- [@official@Pandas](https://pandas.pydata.org/)
- [@video@NumPy vs Pandas](https://www.youtube.com/watch?v=KHoEbRH46Zk)

@ -1,8 +1,8 @@
# Pie Chart
As a data analyst, understanding and efficiently using various forms of data visualization is crucial. Among these, Pie Charts represent a significant tool. Essentially, pie charts are circular statistical graphics divided into slices to illustrate numerical proportions. Each slice of the pie corresponds to a particular category. The pie chart's beauty lies in its simplicity and visual appeal, making it an effective way to convey relative proportions or percentages at a glance. For a data analyst, it's particularly useful when you want to show a simple distribution of categorical data. Like any tool, though, it's important to use pie charts wisely—ideally, when your data set has fewer than seven categories, and the proportions between categories are distinct.
Learn more from the following resources:
- [@video@What is a a pie chart](https://www.youtube.com/watch?v=GjJdZaQrItg)
- [@article@A complete guide to pie charts](https://www.atlassian.com/data/charts/pie-chart-complete-guide)
- [@video@What is a Pie Chart](https://www.youtube.com/watch?v=GjJdZaQrItg)
- [@article@A Complete Guide to Pie Charts](https://www.atlassian.com/data/charts/pie-chart-complete-guide)

@ -4,6 +4,6 @@ Data Analysts recurrently find the need to summarize, investigate, and analyze t
Learn more from the following resources:
- [@articles@Create a pivot table](https://support.microsoft.com/en-gb/office/create-a-pivottable-to-analyze-worksheet-data-a9a84538-bfe9-40a9-a8e9-f99134456576)
- [@article@Pivot tables in excel](https://www.excel-easy.com/data-analysis/pivot-tables.html)
- [@video@How to create a pivot table in excel](https://www.youtube.com/watch?v=PdJzy956wo4)
- [@articles@Create a Pivot Table](https://support.microsoft.com/en-gb/office/create-a-pivottable-to-analyze-worksheet-data-a9a84538-bfe9-40a9-a8e9-f99134456576)
- [@article@Pivot Tables in Excel](https://www.excel-easy.com/data-analysis/pivot-tables.html)
- [@video@How to Create a Pivot Table in Excel](https://www.youtube.com/watch?v=PdJzy956wo4)
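Under the hood, a pivot table groups rows by two keys and aggregates the values. A stdlib sketch of that operation on hypothetical sales records:

```python
from collections import defaultdict

# Hypothetical raw rows: (region, product, revenue)
rows = [
    ("North", "laptop", 1200),
    ("North", "mouse", 25),
    ("South", "laptop", 800),
    ("North", "laptop", 1100),
]

# Pivot: one row per region, one column per product, values summed
pivot = defaultdict(lambda: defaultdict(int))
for region, product, revenue in rows:
    pivot[region][product] += revenue

print({region: dict(cols) for region, cols in pivot.items()})
```

Excel's PivotTable dialog performs this same group-and-aggregate step interactively, with the aggregation function (sum, count, average) chosen from a menu.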

@ -4,5 +4,5 @@ PowerBI, an interactive data visualization and business analytics tool developed
Learn more from the following resources:
- [@official@Power BI Website](https://www.microsoft.com/en-us/power-platform/products/power-bi)
- [@official@Power BI](https://www.microsoft.com/en-us/power-platform/products/power-bi)
- [@video@Power BI for beginners](https://www.youtube.com/watch?v=NNSHu0rkew8)

@ -4,5 +4,5 @@ Predictive analysis is a crucial type of data analytics that any competent data
Learn more from the following resources:
- [@video@What is predictive analytics?](https://www.youtube.com/watch?v=cVibCHRSxB0)
- [@article@What is predictive analytics? - Google](https://cloud.google.com/learn/what-is-predictive-analytics)
- [@video@What is Predictive Analytics?](https://www.youtube.com/watch?v=cVibCHRSxB0)
- [@article@What is Predictive Analytics? - Google](https://cloud.google.com/learn/what-is-predictive-analytics)

@ -4,5 +4,6 @@ PyTorch, an open-source machine learning library, has gained considerable popula
Learn more from the following resources:
- [@official@PyTorch Website](https://pytorch.org/)
- [@official@PyTorch](https://pytorch.org/)
- [@official@PyTorch Documentation](https://pytorch.org/docs/stable/index.html)
- [@video@PyTorch in 100 seconds](https://www.youtube.com/watch?v=ORMx45xqWkA)

@ -4,4 +4,4 @@ The concept of Range refers to the spread of a dataset, primarily in the realm o
Learn more from the following resources:
- [@article@How to find the range of a data set](https://www.scribbr.co.uk/stats/range-statistics/)
- [@article@How to Find the Range of a Data Set](https://www.scribbr.co.uk/stats/range-statistics/)
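The computation itself is just the difference between the largest and smallest observation, shown here on made-up values:

```python
data = [4, 8, 15, 16, 23, 42]

# Range: maximum minus minimum
data_range = max(data) - min(data)
print(data_range)  # → 38
```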

@ -6,5 +6,5 @@ A data analyst leveraging RNNs can effectively chart the intrinsic complexity
Learn more from the following resources:
- [@article@What is a recurrent neural network (RNN)?](https://www.ibm.com/topics/recurrent-neural-networks)
- [@article@Recurrent Neural Networks cheatsheet](https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks)
- [@article@What is a Recurrent Neural Network (RNN)?](https://www.ibm.com/topics/recurrent-neural-networks)
- [@article@Recurrent Neural Networks Cheat-sheet](https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks)
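The recurrence that gives RNNs their memory is a state update applied at every time step. A scalar sketch in plain Python, with the weights as arbitrary illustration values:

```python
import math

def rnn_scalar(sequence, w_in=0.5, w_rec=0.8):
    # The hidden state h carries information from earlier steps to later ones
    h = 0.0
    for x in sequence:
        h = math.tanh(w_in * x + w_rec * h)
    return h

# The final state depends on the whole sequence, not just the last input
print(rnn_scalar([1.0, 0.0, 0.0]))
print(rnn_scalar([0.0, 0.0, 1.0]))
```

Real RNNs use weight matrices and learned parameters instead of scalars, but the step-by-step state update is the same idea.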

@ -4,5 +4,5 @@ A scatter plot, a crucial aspect of data visualization, is a mathematical diagra
Learn more from the following resources:
- [@article@Mastering scatter plots](https://www.atlassian.com/data/charts/what-is-a-scatter-plot)
- [@article@Mastering Scatter Plots](https://www.atlassian.com/data/charts/what-is-a-scatter-plot)
- [@video@Scatter Graphs: What are they and how to plot them](https://www.youtube.com/watch?v=Vyg9qmBsgAc)

@ -4,5 +4,5 @@ Seaborn is a robust, comprehensive Python library focused on the creation of inf
Learn more from the following resources:
- [@official@Seaborn Website](https://seaborn.pydata.org/)
- [@official@Seaborn](https://seaborn.pydata.org/)
- [@video@Seaborn Tutorial : Seaborn Full Course](https://www.youtube.com/watch?v=6GUZXDef2U0)

@ -4,5 +4,5 @@ As a big data processing framework, Apache Spark showcases immense importance in
Learn more from the following resources:
- [@official@Apache Spark Website](https://spark.apache.org/)
- [@official@Apache Spark](https://spark.apache.org/)
- [@opensource@apache/spark](https://github.com/apache/spark)

@ -4,5 +4,5 @@ A stacked chart is an essential tool for a data analyst in the field of data vis
Learn more from the following resources:
- [@article@What is a stacked chart?](https://www.spotfire.com/glossary/what-is-a-stacked-chart)
- [@article@What is a Stacked Chart?](https://www.spotfire.com/glossary/what-is-a-stacked-chart)
- [@article@A Complete Guide to Stacked Bar Charts](https://www.atlassian.com/data/charts/stacked-bar-chart-complete-guide)

@ -2,4 +2,7 @@
Statistical analysis is a core component of a data analyst's toolkit. As professionals dealing with vast amounts of structured and unstructured data, data analysts often turn to statistical methods to extract insights and make informed decisions. The role of statistical analysis in data analytics involves gathering, reviewing, and interpreting data for various applications, enabling businesses to understand their performance, trends, and growth potential. Data analysts use a range of statistical techniques, from modeling and machine learning to data mining, to convey vital information that supports strategic company actions.
Learn more from the following resources:
- [@article@Understanding Statistical Analysis](https://www.simplilearn.com/what-is-statistical-analysis-article)
- [@video@Statistical Analysis](https://www.youtube.com/watch?v=XjMBZE1DuBY)
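The basic descriptive statistics these methods start from are available in Python's standard library; the revenue figures here are made up:

```python
import statistics

# Hypothetical monthly revenue figures
revenue = [102, 98, 110, 95, 120, 105]

print("mean:", statistics.mean(revenue))      # central tendency
print("median:", statistics.median(revenue))  # robust to outliers
print("stdev:", statistics.stdev(revenue))    # sample spread
```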

@ -4,5 +4,5 @@ Supervised machine learning forms an integral part of the toolset for a Data Ana
Learn more from the following resources:
- [@article@What is supervised learning?](https://cloud.google.com/discover/what-is-supervised-learning)
- [@article@What is Supervised Learning?](https://cloud.google.com/discover/what-is-supervised-learning)
- [@article@Supervised Machine Learning](https://www.datacamp.com/blog/supervised-machine-learning)
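The "learning from labeled examples" idea can be shown with a toy nearest-neighbour classifier; the points and labels here are made up, and real work would use a library such as scikit-learn:

```python
# Labeled training data: (feature vector, label)
training = [
    ((1.0, 1.0), "small"),
    ((1.2, 0.8), "small"),
    ((8.0, 9.0), "large"),
    ((9.0, 8.5), "large"),
]

def predict(point):
    # 1-nearest-neighbour: copy the label of the closest training example
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(training, key=lambda ex: dist2(ex[0], point))
    return label

print(predict((1.1, 1.1)))  # near the "small" cluster
print(predict((8.5, 8.8)))  # near the "large" cluster
```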

@ -4,5 +4,5 @@ Tableau is a powerful data visualization tool utilized extensively by data analy
Learn more from the following resources:
- [@official@Tableau Website](https://www.tableau.com/en-gb)
- [@official@Tableau](https://www.tableau.com/en-gb)
- [@video@What is Tableau?](https://www.youtube.com/watch?v=NLCzpPRCc7U)

@ -4,5 +4,6 @@ TensorFlow, developed by Google Brain Team, has become a crucial tool in the rea
Learn more from the following resources:
- [@official@Tensorflow Website](https://www.tensorflow.org/)
- [@official@Tensorflow](https://www.tensorflow.org/)
- [@official@Tensorflow Documentation](https://www.tensorflow.org/learn)
- [@video@Tensorflow in 100 seconds](https://www.youtube.com/watch?v=i8NETqtGHms)

@ -1,20 +1,17 @@
# Introduction to Types of Data Analytics
Data Analytics has proven to be a critical part of decision-making in modern business ventures. It is responsible for discovering, interpreting, and transforming data into valuable information. Different types of data analytics look at past, present, or predictive views of business operations.
Data Analysts, as ambassadors of this domain, employ these types to answer various questions:
- Descriptive Analytics *(what happened in the past?)*
- Diagnostic Analytics *(why did it happen in the past?)*
- Predictive Analytics *(what will happen in the future?)*
- Prescriptive Analytics *(how can we make it happen?)*
Understanding these types gives data analysts the power to transform raw datasets into strategic insights.
Visit the following resources to learn more:
- [@article@Data Analytics and its type](https://www.geeksforgeeks.org/data-analytics-and-its-type/)
- [@article@The 4 Types of Data Analysis: Ultimate Guide](https://careerfoundry.com/en/blog/data-analytics/different-types-of-data-analysis/)
- [@video@Descriptive vs Diagnostic vs Predictive vs Prescriptive Analytics: What's the Difference?](https://www.youtube.com/watch?v=QoEpC7jUb9k)
- [@video@Types of Data Analytics](https://www.youtube.com/watch?v=lsZnSgxMwBA)

@ -4,5 +4,5 @@ Unsupervised learning, as a fundamental aspect of Machine Learning, holds great
Learn more from the following resources:
- [@article@What is unsupervised learning?](https://cloud.google.com/discover/what-is-unsupervised-learning)
- [@article@Introduction to unsupervised learning](https://www.datacamp.com/blog/introduction-to-unsupervised-learning)
- [@article@What is Unsupervised Learning?](https://cloud.google.com/discover/what-is-unsupervised-learning)
- [@article@Introduction to Unsupervised Learning](https://www.datacamp.com/blog/introduction-to-unsupervised-learning)
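A tiny illustration of finding structure without labels: k-means on one-dimensional data, in pure Python with a fixed number of iterations and made-up values:

```python
def kmeans_1d(values, centers, iterations=10):
    # Alternate: assign each value to its nearest center, then move the centers
    for _ in range(iterations):
        clusters = {c: [] for c in centers}
        for v in values:
            nearest = min(centers, key=lambda c: abs(c - v))
            clusters[nearest].append(v)
        centers = [sum(vs) / len(vs) if vs else c for c, vs in clusters.items()]
    return sorted(centers)

data = [1.0, 1.1, 0.9, 8.0, 8.2, 7.8]
print(kmeans_1d(data, centers=[0.0, 10.0]))
```

The algorithm is never told which group each value belongs to; the two cluster centers emerge from the data itself.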

@ -4,5 +4,5 @@ Data analysts heavily rely on statistical concepts to analyze and interpret data
Learn more from the following resources:
- [@article@What is variance?](https://www.investopedia.com/terms/v/variance.asp)
- [@article@How to calculate variance](https://www.scribbr.co.uk/stats/variance-meaning/)
- [@article@What is Variance?](https://www.investopedia.com/terms/v/variance.asp)
- [@article@How to Calculate Variance](https://www.scribbr.co.uk/stats/variance-meaning/)
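Variance is the average squared deviation from the mean; Python's `statistics` module has it built in, and the manual computation confirms the formula on made-up values:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

mean = sum(data) / len(data)
manual = sum((x - mean) ** 2 for x in data) / len(data)

print(manual)                      # population variance, computed by hand
print(statistics.pvariance(data))  # same result from the stdlib
```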
