HCI 660 Topic 5 Data Processing Assignment

Processing of Information
Data processing is the conversion of data into an intended, usable form. The conversion is carried out by applying a set of specified operations, either manually or, as is now most common, automatically on computers. Abstraction, normalization, and reconciliation are among the methodologies that can be employed in data processing. Each of these approaches comes with its own difficulties, and the issues that arise depend largely on the method that has been adopted. As a result, the strategies and plans put in place to address these issues are critical to overcoming them.
Abstraction of Data

Clinical data abstraction is the process of examining clinical records, whether paper or electronic, to identify the data elements required for secondary use. Data abstraction frequently results in a summary of patient information for secondary use (van Cruchten & Weigand, 2018). Data is extracted from clinical records by directly matching information in the records to the required data elements, and by additional actions on the data such as coding, categorizing, summarizing, interpreting, and calculating (Wang et al., 2018). One of the challenges associated with data abstraction is the complexity of the databases and of the procedures that must be followed.
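
The matching and derivation steps described above can be illustrated with a short Python sketch. It assumes a hypothetical record layout; the field names, the HbA1c cut-off, and the derived elements are illustrative only, not taken from the cited sources or any particular registry.

from datetime import date

def abstract_record(record: dict) -> dict:
    """Pull the required secondary-use elements out of one clinical record."""
    return {
        # Direct matching: fields that map one-to-one to required elements.
        "patient_id": record["patient_id"],
        "admission_date": record["admission_date"],
        # Coding / categorizing: derive a summary category from a raw value
        # (the 9.0 cut-off is an assumption made for this example).
        "glycemic_control": "poor" if record["hba1c"] >= 9.0 else "adequate",
        # Calculating: derive length of stay from two dates.
        "length_of_stay_days": (record["discharge_date"] - record["admission_date"]).days,
    }

# Hypothetical source record.
sample = {
    "patient_id": "P-0001",
    "admission_date": date(2021, 3, 1),
    "discharge_date": date(2021, 3, 5),
    "hba1c": 9.4,
}
print(abstract_record(sample))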

The Data Normalization Process

Data normalization is the process of structuring or arranging data in a database. The procedure includes generating tables and establishing relationships between them according to rules designed to make the database more adaptable while also protecting the data and information. Data normalization also entails removing inconsistent dependencies and redundancy (Jungherr, 2018). Redundant data wastes disk space and creates maintenance problems, so normalization is critical for ensuring efficient access and for the establishment of additional storage structures. One of the challenges associated with data normalization is the duplication that is already present in the data.
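
As an illustration, the following Python sketch splits a hypothetical flat visit table, in which patient details are repeated on every row, into two related tables. The table layout and field names are assumptions made for the example.

# Flat, denormalized rows: the patient's name is repeated on every visit.
flat_rows = [
    {"visit_id": 1, "patient_id": "P-01", "patient_name": "Ann Lee", "visit_date": "2021-03-01"},
    {"visit_id": 2, "patient_id": "P-01", "patient_name": "Ann Lee", "visit_date": "2021-04-12"},
    {"visit_id": 3, "patient_id": "P-02", "patient_name": "Ben Cho", "visit_date": "2021-03-09"},
]

# Split the flat table into two related tables so each fact is stored once.
patients = {}   # patient_id -> patient attributes (one entry per patient)
visits = []     # one row per visit, referencing the patient by key only

for row in flat_rows:
    patients[row["patient_id"]] = {"patient_name": row["patient_name"]}
    visits.append({"visit_id": row["visit_id"],
                   "patient_id": row["patient_id"],
                   "visit_date": row["visit_date"]})

print(patients)  # redundancy removed: each patient's name appears once
print(visits)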

Data Reconciliation

Data reconciliation covers the verification steps carried out during data migration. Throughout the process, the destination data is compared against the source data to ensure that the migration architecture is transferring accurate information (Cong et al., 2018). Data reconciliation and validation can also be described as the application of technology built on mathematical models to process information correctly. The reconciliation process always involves models, measurements, and a reconciliation algorithm, which together produce reconciled data. One of the difficulties in data reconciliation is the presence of many different data structures that can be hard to compare.
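
A minimal Python sketch of this kind of source-to-destination comparison is shown below. It assumes both sides can be read into lists of dictionaries keyed by a record identifier; the "record_id" field and the checksum-based comparison are illustrative choices, not a prescribed reconciliation algorithm.

import hashlib
import json

def checksum(record):
    """Stable fingerprint of a record so the two sides can be compared."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def reconcile(source, destination):
    """Compare source and destination records keyed by the assumed 'record_id' field."""
    src = {r["record_id"]: checksum(r) for r in source}
    dst = {r["record_id"]: checksum(r) for r in destination}
    return {
        "missing_in_destination": sorted(src.keys() - dst.keys()),
        "unexpected_in_destination": sorted(dst.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & dst.keys() if src[k] != dst[k]),
    }

# Hypothetical migration: one record was dropped and one was altered in transit.
source = [{"record_id": 1, "dose": 10}, {"record_id": 2, "dose": 20}, {"record_id": 3, "dose": 30}]
destination = [{"record_id": 1, "dose": 10}, {"record_id": 2, "dose": 25}]
print(reconcile(source, destination))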

The Difficulties of Using Data from Multiple Sources

Integrating or combining data from several sources, which may include unstructured, semi-structured, and structured data, can be a difficult task. Sophisticated techniques are constantly being developed and applied for this purpose, and they may or may not produce correct conclusions (Belkin et al., 2019). Different sources contain different types of data, which may need to be cleaned or normalized into the required format. Applying data from many sources also takes time, because database administrators must rearrange all of the data to ensure that it fits the new system. Finally, combining data from several sources can introduce security risks: data that has already been cleaned and consolidated is an easier target to tamper with.
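
The cleaning and normalization steps involved in combining sources can be sketched in Python as follows. The two inline sources, the field names, and the target layout are assumptions made purely for illustration.

import csv
import io
import json

csv_source = "patient_id,age\nP-01,54\nP-02,61\n"           # structured source
json_source = '[{"id": "P-03", "details": {"age": "47"}}]'  # semi-structured source

combined = []

# Each source needs its own cleaning step before the records fit one schema.
for row in csv.DictReader(io.StringIO(csv_source)):
    combined.append({"patient_id": row["patient_id"], "age": int(row["age"])})

for item in json.loads(json_source):
    combined.append({"patient_id": item["id"], "age": int(item["details"]["age"])})

print(combined)  # every record now follows the same normalized layout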

References

Belkin, M., Hsu, D., Ma, S., & Mandal, S. (2019). Reconciling modern machine-learning practice and the classical bias–variance trade-off. Proceedings of the National Academy of Sciences, 116(32), 15849-15854. https://doi.org/10.1073/pnas.1903070116

Cong, P. T., Toan, N. T., Hung, N. Q. V., & Stantic, B. (2018, June). Reducing the time it takes to reconcile data from participatory sensing. In Proceedings of the 8th International Conference on Web Intelligence, Mining and Semantics (pp. 1-11). https://dl.acm.org/doi/abs/10.1145/3227609.3227678

Jungherr, A. (2018). Normalizing digital trace data. In Digital discussions (pp. 9-35). Routledge. https://www.taylorfrancis.com/chapters/edit/10.4324/9781351209434-2/normalizing-digital-trace-data-andreas-jungherr

van Cruchten, R. R., & Weigand, H. H. (2018, May). The necessity for rule-based data abstraction in logistics process mining. In 2018 12th International Conference on Research Challenges in Information Science (RCIS) (pp. 1-9). IEEE. https://ieeexplore.ieee.org/abstract/document/8406653

Wang, W., Liu, J., Pitsilis, G., & Zhang, X. (2018). Abstracting massive data for lightweight intrusion detection in computer networks. Information Sciences, 417-430. https://doi.org/10.1016/j.ins.2016.10.023
