Batch Processing
Processing data in large blocks at scheduled intervals.
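A minimal sketch of the idea in Python, assuming hypothetical `orders` and `daily_sales` tables; a scheduler such as cron would kick the job off at a fixed interval:

```python
import sqlite3
from datetime import date

# A minimal sketch of a scheduled batch job: one pass aggregates a full
# day's data at once. The `orders` and `daily_sales` tables are hypothetical.
def run_daily_batch(db_path: str, day: date) -> None:
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "INSERT INTO daily_sales (day, total) "
            "SELECT ?, SUM(amount) FROM orders WHERE order_date = ?",
            (day.isoformat(), day.isoformat()),
        )
        conn.commit()
    finally:
        conn.close()
```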
Business Intelligence (BI)
A set of technologies and processes for analyzing data and presenting actionable information to help executives, managers, and other corporate end users make informed business decisions.
Change Data Capture
The process of identifying and capturing changes made to source data as they occur, enabling real-time updates to data warehouses.
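A common poll-based approximation is a watermark query; the sketch below assumes a hypothetical `customers` table with an `updated_at` column (log-based CDC tools read the database's transaction log instead):

```python
import sqlite3

# Poll-based change capture: fetch only rows modified since the last
# watermark. Table and column names here are hypothetical.
def fetch_changes(conn: sqlite3.Connection, last_seen: str):
    rows = conn.execute(
        "SELECT id, name, updated_at FROM customers "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_seen,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_seen
    return rows, new_watermark
```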
Dashboard
A visual display that presents key information and metrics in an easily digestible format, allowing users to monitor performance and track progress towards goals.
Data Analysis
The process of inspecting, cleaning, transforming, and modeling data to discover useful information that can support decision-making in business intelligence.
Data Cleansing
The process of identifying and correcting or removing errors, inconsistencies, and inaccuracies in data stored in the data warehouse.
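A small illustrative pass with pandas, where the `email` and `amount` columns are made-up stand-ins for real warehouse fields:

```python
import pandas as pd

# Illustrative cleansing steps over hypothetical columns.
def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates()                          # remove exact duplicates
    df["email"] = df["email"].str.strip().str.lower()  # normalize formatting
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")  # invalid -> NaN
    return df.dropna(subset=["email", "amount"])       # drop rows missing key fields
```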
Data Latency
The delay between data being generated and being available for reporting and analysis in a data warehouse.
Data Mart
A smaller, specialized subset of a data warehouse that is focused on a particular business function or department.
Data Mining
The process of discovering patterns, insights, and valuable information from large datasets stored in the data warehouse.
Data Visualization
The graphical representation of information and data using visual elements such as charts, graphs, and maps to facilitate understanding and decision-making in business intelligence.
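A minimal matplotlib sketch with invented figures:

```python
import matplotlib.pyplot as plt

# A minimal chart; the data is invented purely for illustration.
regions = ["East", "West", "North", "South"]
sales = [120, 95, 80, 110]
plt.bar(regions, sales)
plt.title("Sales by Region")
plt.ylabel("Units sold")
plt.show()
```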
Data Warehousing
The process of collecting, organizing, and storing data to be retrieved and analyzed later.
Denormalization
A database design technique that reduces the need for frequent joins by consolidating tables, thus improving query performance.
ETL (Extract, Transform, Load)
The process of extracting data from various sources, transforming it to fit the data warehouse schema, and loading it into the warehouse.
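A toy end-to-end pipeline in Python, assuming a hypothetical CSV layout and `sales` table (the Extract, Transform, and Load entries below describe each step individually):

```python
import csv
import sqlite3

# A toy ETL pipeline; the CSV layout and `sales` table are assumptions.
def extract(path: str):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)        # pull rows from a source file

def transform(rows):
    for row in rows:                        # reshape to the warehouse schema
        yield (row["id"], row["name"].strip().title(), float(row["amount"]))

def load(rows, conn: sqlite3.Connection):
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()                           # persist into the warehouse

conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS sales (id TEXT, name TEXT, amount REAL)")
load(transform(extract("sales.csv")), conn)
```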
ETL Tool
A software tool or platform that facilitates the automation and management of the Extract, Transform, Load process.
Event-Driven Architecture
A design pattern in which the flow of the application is determined by events as they occur, rather than being controlled by a central program flow.
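A tiny publish/subscribe sketch, not tied to any particular framework; the `order_placed` event and its handlers are invented:

```python
# A minimal event bus: handlers subscribe to event types and run when
# an event is published; each reaction is independent of the others.
handlers: dict[str, list] = {}

def subscribe(event_type: str, handler) -> None:
    handlers.setdefault(event_type, []).append(handler)

def publish(event_type: str, payload) -> None:
    for handler in handlers.get(event_type, []):
        handler(payload)

subscribe("order_placed", lambda o: print(f"reserving stock for order {o['id']}"))
subscribe("order_placed", lambda o: print(f"emailing receipt for order {o['id']}"))
publish("order_placed", {"id": 42})
```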
Extract
The first step in the Extract, Transform, Load (ETL) process, which involves retrieving data from various sources, such as databases, files, or APIs.
Extraction
The process of retrieving data from source systems or applications.
Fact Table
A central table in a data warehouse that contains the primary measures or metrics of a business process.
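A star-schema sketch using SQLite, where the fact table holds measures plus foreign keys into dimension tables; all names are hypothetical:

```python
import sqlite3

# A star-schema sketch: the fact table holds numeric measures plus
# foreign keys to dimensions. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_id    INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    quantity   INTEGER,   -- measure
    revenue    REAL       -- measure
);
""")
```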
Load
The third and final step in the ETL process, which involves transferring the transformed data into the target database, data warehouse, or application.
Micro-Batch Processing
An approach in which data is processed in small, fixed-size batches rather than all at once.
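A minimal chunking helper illustrating the idea; the batch size and workload are arbitrary:

```python
from itertools import islice

# Drain any iterable in fixed-size chunks.
def micro_batches(stream, size: int = 100):
    it = iter(stream)
    while batch := list(islice(it, size)):
        yield batch

for batch in micro_batches(range(1050), size=100):
    print(f"processing {len(batch)} records")  # stand-in for real work
```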
OLAP (Online Analytical Processing)
The capability of a system to provide multidimensional analysis of data in a data warehouse.
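A small taste of the idea with a pandas pivot table, aggregating one measure across two dimensions (the data is invented):

```python
import pandas as pd

# Aggregate a measure (sales) across two dimensions (region, quarter).
df = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "sales":   [100, 120, 90, 130],
})
cube = df.pivot_table(values="sales", index="region",
                      columns="quarter", aggfunc="sum")
print(cube)
```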
OLTP Database
A traditional Online Transaction Processing (OLTP) database system that prioritizes data integrity and normalization to minimize redundancy.
Predictive Analytics
The use of statistical techniques and machine learning algorithms to analyze current and historical data and make predictions about future events and outcomes in business intelligence.
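A minimal scikit-learn sketch: fit a model on historical values, then forecast the next one (the numbers are purely illustrative):

```python
from sklearn.linear_model import LinearRegression

# Fit on historical data, predict a future value; data is illustrative.
X = [[1], [2], [3], [4]]        # e.g. month number
y = [100, 110, 125, 140]        # e.g. historical sales
model = LinearRegression().fit(X, y)
print(model.predict([[5]]))     # forecast for month 5
```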
Real-Time Updates
The process of updating data in a data warehouse as soon as new information becomes available.
Real-Time Analytics
The analysis of data as soon as it is acquired, often used to make immediate decisions or respond to events as they happen.
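A toy running aggregate that updates with each incoming event rather than waiting for a batch window; the event source is simulated:

```python
import random

# Update a running average as each event arrives.
count, total = 0, 0.0
for _ in range(5):
    event = random.uniform(10, 20)  # stand-in for an incoming measurement
    count += 1
    total += event
    print(f"running average after {count} events: {total / count:.2f}")
```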
Time-Stamped Data
Data entries tagged with the time at which they were recorded, allowing comparisons across different time frames.
Transform
The second step in the ETL process, which involves converting and restructuring the extracted data into a format suitable for the destination system or application.
Unnormalized Database
A database that does not follow normalization rules; in data warehousing, the redundancy is often introduced deliberately (denormalization) to improve query efficiency.
Validation
The process of ensuring the accuracy, completeness, and quality of the extracted and transformed data.
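A minimal rule-check sketch with hypothetical field names, collecting failures instead of aborting the load:

```python
# Row-level rule checks before loading; field names are hypothetical.
def validate(row: dict) -> list[str]:
    errors = []
    if not row.get("id"):
        errors.append("missing id")
    amount = row.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    return errors

good, bad = [], []
for row in [{"id": 1, "amount": 9.5}, {"id": None, "amount": -2}]:
    errs = validate(row)
    (bad if errs else good).append(row)
print(f"{len(good)} valid, {len(bad)} rejected")
```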