| Term | Definition |
| --- | --- |
| Data Warehousing | The process of collecting, organizing, and storing data so it can be retrieved and analyzed later. |
| ETL | The process of extracting data from various sources, transforming it to fit the data warehouse schema, and loading it into the data warehouse. |
| Fact Table | A central table in a data warehouse that contains the primary measures or metrics of a business process. |
| Data Mart | A smaller, specialized subset of a data warehouse focused on a particular business function or department. |
| OLAP (Online Analytical Processing) | The capability of a system to provide multidimensional analysis of data in a data warehouse. |
| Data Mining | The process of discovering patterns, insights, and valuable information in large datasets stored in the data warehouse. |
| Data Cleansing | The process of identifying and correcting or removing errors, inconsistencies, and inaccuracies in data stored in the data warehouse. |
| Business Intelligence | A technology-driven process for analyzing data and presenting actionable information that helps executives, managers, and other end users make more informed business decisions. |
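The extract–transform–load flow described above can be sketched as a minimal pipeline. This is an illustrative toy, not a real ETL tool: the in-memory source rows, the cleansing rules, and the SQLite database standing in for the warehouse are all hypothetical.

```python
import sqlite3

# Extract: pull raw rows from a source (a hypothetical in-memory "source system").
def extract():
    return [
        {"order_id": 1, "amount": "19.99", "region": "north"},
        {"order_id": 2, "amount": "5.50", "region": "NORTH"},
    ]

# Transform: cleanse and reshape rows to fit the warehouse schema
# (cast amounts to numbers, normalize inconsistent region codes).
def transform(rows):
    return [(r["order_id"], float(r["amount"]), r["region"].lower()) for r in rows]

# Load: write the transformed rows into the target fact table.
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # SQLite stands in for the data warehouse
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())
```

Note how each stage is a separate function: real ETL tools follow the same separation, which makes each stage independently testable and replaceable.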
| Term | Definition |
| --- | --- |
| Data Analysis | The process of inspecting, cleaning, transforming, and modeling data to discover useful information that can support decision-making in business intelligence. |
| Dashboard | A visual display of key performance indicators (KPIs) and other important business metrics, providing a real-time snapshot of the organization's performance. |
| Predictive Analytics | The use of statistical techniques and machine learning algorithms to analyze current and historical data and make predictions about future events and outcomes. |
| Data Visualization | The graphical representation of information and data, using visual elements such as charts, graphs, and maps, to facilitate understanding and decision-making in business intelligence. |
| Extract | The first step in the Extract, Transform, Load (ETL) process, which involves retrieving data from various sources, such as databases, files, or APIs. |
| Transform | The second step in the ETL process, which involves converting and restructuring the extracted data into a format suitable for the destination system or application. |
| Load | The third and final step in the ETL process, which involves transferring the transformed data into the target database, data warehouse, or application. |
| Extraction | The process of retrieving or pulling data from the source systems or applications. |
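As a toy illustration of the predictive-analytics idea above: fit a least-squares trend line to historical values, then extrapolate one period ahead. The monthly figures are made up for the example, and real predictive analytics would use richer models and far more data.

```python
# Hypothetical historical values: (period, metric), e.g. monthly sales.
history = [(1, 10.0), (2, 12.0), (3, 14.0), (4, 16.0)]

# Fit y = a*x + b by ordinary least squares on the historical data.
n = len(history)
sx = sum(x for x, _ in history)
sy = sum(y for _, y in history)
sxx = sum(x * x for x, _ in history)
sxy = sum(x * y for x, y in history)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# Predict the next period (x = 5) by extrapolating the fitted trend.
forecast = a * 5 + b
print(forecast)  # the sample data is exactly linear (y = 2x + 8), so this is 18.0
```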
| Term | Definition |
| --- | --- |
| Validation | The process of ensuring the accuracy, completeness, and quality of the extracted and transformed data. |
| ETL Tool | A software tool or platform that facilitates the automation and management of the Extract, Transform, Load process. |
| Real-Time Updates | The practice of updating data in a data warehouse as soon as new information becomes available. |
| Change Data Capture | The process of identifying and capturing changes made to data in real time, enabling real-time updates to data warehouses. |
| Batch Processing | The processing of data in large blocks at scheduled intervals. |
| Data Latency | The delay between data being generated and data being available for reporting and analysis in a data warehouse. |
| Event-Driven Architecture | A design pattern in which the flow of an application is determined by events as they occur, rather than by a central program flow. |
| Micro-Batch Processing | An approach in which data is processed in small, fixed-size batches rather than all at once. |
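One way to picture change data capture and micro-batch processing together is a sync loop that pulls only rows changed since the last run and applies them in small batches. This sketch is hypothetical throughout: the per-row `version` column, the high-water mark, and the batch size are assumptions, and real CDC tools typically read the database's transaction log instead.

```python
# Hypothetical source table: each row carries a monotonically increasing version.
source = [
    {"id": 1, "value": "a", "version": 3},
    {"id": 2, "value": "b", "version": 5},
    {"id": 3, "value": "c", "version": 6},
]

warehouse = {}           # target store: id -> value
last_synced_version = 4  # high-water mark saved by the previous sync run

# Change data capture: select only rows modified since the last sync.
changes = [r for r in source if r["version"] > last_synced_version]

# Micro-batch processing: apply the captured changes in small, fixed-size batches.
BATCH_SIZE = 1
for i in range(0, len(changes), BATCH_SIZE):
    for row in changes[i:i + BATCH_SIZE]:
        warehouse[row["id"]] = row["value"]

# Advance the high-water mark so the next run skips already-synced rows.
last_synced_version = max(
    (r["version"] for r in changes), default=last_synced_version
)
print(warehouse, last_synced_version)  # only the rows changed since version 4 moved
```

Shrinking the batch size and running the loop more often reduces data latency; growing it and running on a schedule approaches classic batch processing.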
| Term | Definition |
| --- | --- |
| Real-Time Analytics | The analysis of data as soon as it is acquired, often used to make immediate decisions or respond to events as they happen. |
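A minimal sketch of the real-time flavor described above: analyze each event as it arrives, maintaining a running metric and reacting immediately, rather than waiting for a scheduled batch. The event values and the alert threshold are hypothetical; a production system would consume a continuous stream rather than a fixed list.

```python
# Hypothetical event stream; in practice events would arrive continuously.
events = [12, 45, 7, 90, 3]

alerts = []
count, total = 0, 0
for value in events:       # each event is processed as soon as it "arrives"
    count += 1
    total += value
    running_avg = total / count  # metric kept up to date on every event
    if value > 50:               # respond immediately to an anomalous event
        alerts.append(value)

print(running_avg, alerts)
```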