All of the following are ways to consolidate data EXCEPT:
data rollup and integration.
Data governance can be defined as:
high-level organizational groups and processes that oversee data stewardship.
One characteristic of quality data which pertains to the expectation for the time between when data are expected and when they are available for use is:
timeliness.
Converting data from the format of its source to the format of its destination is called:
data transformation.
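As a minimal sketch of data transformation, the function below converts a record from a hypothetical source format (MM/DD/YYYY dates, upper-case names) into a hypothetical destination format (ISO dates, title case); the field names and formats are illustrative assumptions, not from any particular system.

```python
from datetime import datetime

def transform_record(source_record):
    """Convert a record from the source's format to the destination's format.

    Hypothetical formats: the source stores dates as MM/DD/YYYY strings and
    names in upper case; the destination expects ISO dates and title case.
    """
    return {
        "name": source_record["name"].title(),
        "hire_date": datetime.strptime(source_record["hire_date"], "%m/%d/%Y")
                             .strftime("%Y-%m-%d"),
    }

print(transform_record({"name": "ADA LOVELACE", "hire_date": "12/10/1842"}))
```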
External data sources present problems for data quality because:
there is a lack of control over data quality.
Conformance means that:
data are stored, exchanged or presented in a format that is specified by its metadata.
Including data capture controls (e.g., dropdown lists) helps reduce ________ deteriorated data problems.
data entry.
The process of combining data from various sources into a single table or view is called:
joining.
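Joining can be sketched in plain Python as an inner join of two sources on a shared key; the `customers`/`orders` data and the `cust_id` key are made-up examples.

```python
def join(left_rows, right_rows, key):
    """Combine rows from two sources into a single view on a shared key
    (an inner join, sketched with plain dicts)."""
    index = {row[key]: row for row in right_rows}
    return [{**l, **index[l[key]]} for l in left_rows if l[key] in index]

customers = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Beta"}]
orders = [{"cust_id": 1, "total": 250}]
print(join(customers, orders, "cust_id"))
```

In SQL the same operation would be an `INNER JOIN`; the dict-index approach here mirrors a hash join.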
Informational and operational data differ in all of the following ways EXCEPT:
level of detail.
A technique using artificial intelligence to upgrade the quality of raw data is called:
data scrubbing.
TQM stands for:
Total Quality Management.
Quality data can be defined as being:
unique.
Data quality is important for all of the following reasons EXCEPT:
it provides a stream of profit.
A technique using pattern recognition to upgrade the quality of raw data is called:
data scrubbing.
The process of transforming data from a detailed to a summary level is called:
aggregating.
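A minimal sketch of aggregating, rolling detail-level rows up to summary totals per group; the `region`/`amount` fields are hypothetical.

```python
from collections import defaultdict

def aggregate(detail_rows, group_key, value_key):
    """Transform detail-level rows into summary-level totals per group."""
    totals = defaultdict(float)
    for row in detail_rows:
        totals[row[group_key]] += row[value_key]
    return dict(totals)

sales = [
    {"region": "East", "amount": 100.0},
    {"region": "East", "amount": 50.0},
    {"region": "West", "amount": 75.0},
]
print(aggregate(sales, "region", "amount"))
```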
All of the following are tasks of data cleansing EXCEPT:
creating foreign keys.
Data that are accurate, consistent, and available in a timely fashion are considered:
high-quality.
A characteristic of reconciled data that means the data reflect an enterprise-wide view is:
comprehensive.
The major advantage of data propagation is:
real-time cascading of data changes throughout the organization.
User interaction integration is achieved by creating fewer ________ that feed different systems.
user interfaces.
Event-driven propagation:
pushes data to duplicate sites as an event occurs.
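Event-driven propagation can be sketched with a tiny observer-style class that pushes each write to every registered duplicate site as the event occurs, rather than waiting for a batch refresh; the class and its replicas are illustrative assumptions.

```python
class EventDrivenPropagator:
    """Push each change to every duplicate site as the event occurs
    (a minimal sketch; real systems would do this over the network)."""

    def __init__(self):
        self.replicas = []

    def register(self, replica):
        self.replicas.append(replica)

    def write(self, key, value):
        for replica in self.replicas:  # propagate immediately on the event
            replica[key] = value

site_a, site_b = {}, {}
prop = EventDrivenPropagator()
prop.register(site_a)
prop.register(site_b)
prop.write("cust_1", "Acme")  # both sites receive the change at once
```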
In the ________ approach, one consolidated record is maintained from which all applications draw data.
persistent.
________ duplicates data across databases.
Data propagation.
One way to improve the data capture process is to:
check entered data immediately for quality against data in the database.
Which of the following are key steps in a data quality program?
Apply TQM principles and practices.
The methods to ensure the quality of data across various subject areas are called:
Master Data Management.
A datatype conflict is an example of a(n) ________ reason for deteriorated data quality.
external data source.
Which type of index is commonly used in data warehousing environments?
Bit-mapped index.
Data quality problems can cascade when:
data are copied from legacy systems.
In the ________ approach, one consolidated record is maintained, and all applications draw on that one actual “golden” record.
persistent.
Data federation is a technique which:
provides a virtual view of integrated data without actually creating one centralized database.
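Data federation can be sketched as a virtual view that answers each lookup by consulting the underlying sources on demand, never copying their data into one central database; the `crm`/`erp` sources are hypothetical stand-ins for real systems.

```python
class FederatedView:
    """A virtual, integrated view: queries are delegated to the underlying
    sources at lookup time, with no centralized copy of the data."""

    def __init__(self, sources):
        self.sources = sources  # dict-like source systems, queried in order

    def get(self, key):
        for source in self.sources:
            if key in source:
                return source[key]
        return None

crm = {"cust_1": "Acme"}
erp = {"cust_2": "Beta"}
view = FederatedView([crm, erp])
print(view.get("cust_2"))
```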
One simple task of a data quality audit is to:
statistically profile all files.
A method of capturing only the changes that have occurred in the source data since the last capture is called ________ extract.
incremental
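An incremental extract can be sketched as a timestamp filter: only rows modified since the last capture are pulled. The `modified_at` field and ISO-date strings are assumptions for illustration; real sources might use change-data-capture logs instead.

```python
def incremental_extract(source_rows, last_capture_time):
    """Capture only the rows changed since the last extract
    (timestamp-based sketch; assumes each row carries 'modified_at')."""
    return [r for r in source_rows if r["modified_at"] > last_capture_time]

rows = [
    {"id": 1, "modified_at": "2024-01-05"},
    {"id": 2, "modified_at": "2024-03-20"},
]
print(incremental_extract(rows, "2024-02-01"))  # ISO strings compare correctly
```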
Which of the following is a basic method for single field transformation?
Table lookup.
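A table-lookup single-field transformation simply replaces a coded value with the corresponding value from a lookup table; the state-code table below is a made-up example.

```python
# Hypothetical lookup table mapping source codes to destination values.
STATE_LOOKUP = {"CA": "California", "NY": "New York", "TX": "Texas"}

def lookup_transform(code, table, default="UNKNOWN"):
    """Single-field transformation via table lookup: replace a coded
    value with the value found for it in the lookup table."""
    return table.get(code, default)

print(lookup_transform("CA", STATE_LOOKUP))
```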
Loading data into a data warehouse does NOT involve:
formatting the hard drive.
Getting poor data from a supplier is a(n) ________ reason for deteriorated data quality.
external data source.
All of the following are popular architectures for Master Data Management EXCEPT:
Normalization.
Data quality ROI stands for:
risk of incarceration.
The best place to improve data entry across all applications is:
in the database definitions.
An approach to filling a data warehouse that employs bulk rewriting of the target data periodically is called:
refresh mode.
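Refresh mode can be sketched as a periodic bulk rewrite: the old target contents are discarded and the full source snapshot is loaded in their place (as opposed to the incremental update mode above). The `warehouse` dict stands in for the target table.

```python
def refresh(target, source_snapshot):
    """Refresh mode: periodically rewrite the target in bulk from a
    full snapshot of the source (a minimal sketch)."""
    target.clear()                  # bulk-delete the old target data
    target.update(source_snapshot)  # bulk-load the new snapshot
    return target

warehouse = {"cust_1": "stale value"}
refresh(warehouse, {"cust_1": "Acme", "cust_2": "Beta"})
print(warehouse)
```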