Database Normalization

ABSTRACT
Databases are designed to exploit the relationships between the data in their records to provide useful information that supports the needs of individuals and organizations. The data may occur as several duplicate records, which leads to redundancy in the database and may affect the integrity of the information drawn from it. The database normalization process aims to determine the degree of redundancy in the database. Two goals are considered when performing normalization: fully describing the level of redundancy in the relational database model, and formulating a strategy for converting the database model to reduce the existing redundancy. This paper explores the concept of database normalization and the challenges experienced by students and industry professionals in developing efficient databases.

KEYWORDS
Database, Gamification, Relational Database Management System (RDBMS), Functional Dependencies, Normalization, Relationships, Relational Model.

INTRODUCTION
Normalization is a theory in relational databases that provides guidelines for the efficient organization of data [1]. The process of database normalization seeks to determine the extent of redundancy existing in a database. It is carried out with two goals in mind: to sufficiently describe the level of redundancy that exists in a relational database schema, and to develop strategies for transforming the schema so as to reduce those redundancies. By performing normalization with these objectives, database developers can reduce the storage consumed by the database and ensure that the data stored in it is logically organized [1].

It is a challenge to design and build an efficient Relational Database Management System (RDBMS), but the process and effort are satisfying and rewarding when the results are delivered. Most problems in information management are caused by uncontrolled redundancy of data in the database, which in turn compromises the integrity of the data it contains [2]. To overcome this issue, databases containing very large tables are broken down into smaller tables that separate out the redundant data through the normalization process. The smaller tables are then linked to each other through the relational schema.

EXISTING WORK
We begin with the article "An Ameliorated Approach towards Automating the Normalization Process". Its purpose is to create a learning system that teaches employees about the normalization process quickly and easily. One example concerns dependencies in a table: a student needs a student number, a course needs a course number, and so on. To eliminate redundancy, recurring attributes should be placed in separate tables of their own, which removes the excess data. The article also explores a newer framing of database normalization: the long-term "debt" incurred as a database degrades due to a lack of normalization. This debt is especially visible in tables below fourth normal form. Bringing every table in a database to fourth normal form would be ideal but also very expensive. Performance degradation is therefore estimated using an input/output method that estimates the cost of each table; using these calculations, one can determine whether normalizing the tables is the most cost-efficient approach.
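As an illustration of this decomposition idea, the short sketch below uses hypothetical table, column names, and values chosen only for this example (they are not taken from the article). It pulls a recurring attribute out of a flat, redundant table into its own table that is linked back through a key:

```python
# A minimal sketch (hypothetical data) of splitting a redundant flat table
# into two smaller tables linked by a key.

# Each row repeats the course title for every enrolled student.
flat_rows = [
    {"student_no": 1, "student_name": "Ann",  "course_no": "CS101", "course_title": "Databases"},
    {"student_no": 2, "student_name": "Ben",  "course_no": "CS101", "course_title": "Databases"},
    {"student_no": 3, "student_name": "Cara", "course_no": "CS102", "course_title": "Networks"},
]

# Pull the recurring course attributes out into their own table,
# keyed on course_no, so each title is stored exactly once.
courses = {row["course_no"]: row["course_title"] for row in flat_rows}

# The remaining enrollment table keeps only the foreign key course_no.
enrollments = [
    {"student_no": row["student_no"], "student_name": row["student_name"], "course_no": row["course_no"]}
    for row in flat_rows
]

print(courses)      # {'CS101': 'Databases', 'CS102': 'Networks'}
print(enrollments)  # course_title is no longer duplicated per student
```

Each course title is now stored exactly once, so the duplicated data, and the storage it consumes, is removed.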
The article "Normalization: Some Generalities" discusses separating redundant data into separate tables using a lossless method, that is, one that does not lose any data, thereby reducing the number of times the same data appears. This separation of data is known as a "two-relvar" design, which is more efficient and capable of representing certain information that the original single-relvar design cannot. The purpose of normalization is to reduce data redundancy and inconsistency and to maintain atomicity between the relations and the database. The model proposed in the article uses a table with one column and three rows. This tabular approach uses the dependencies in a database to determine its keys; once the keys are determined, they can be used to reach a higher level of relational database design (a short sketch of this dependency-based key search is given below).

The paper "Relational database normalization" proposes an improved normal form which, according to the principles of representation and separation, supplies a better basis for schema design than 3NF or BCNF. The new normal form combines the salient qualities of both BCNF and 3NF, and the algorithm proposed there for the design of 3NF schemas in fact produces EKNF schemas. Redundancy under 3NF is also addressed through the simplification of complex systems by the natural method of drawing the system to be analyzed as graphs that respect Armstrong's axioms, with qualifiers considered for subsets or properties; the approach has high analytical power and various forms of use, its application depending mainly on the nature of the problem. These methods are compatible with computational data structures, which could allow them to be coded into systems that automate the analysis, including through symbol manipulation (rewriting systems).

The article "A Conceptual Model of Database Normalization Courseware Using Gamification Approach" proposes a conceptual model of gamification for learning database normalization, intended to improve how the topic is taught. The concepts of database normalization and the problems students encounter in learning them are matched with gamification features to produce a game. Database normalization courseware will be developed based on the proposed model and tested to evaluate the model's effectiveness.

Databases and blockchain are also considered together. Traditional database systems have been used for storing the transaction data of blockchains, both as components of blockchain platforms or applications and in decentralized solutions that use traditional databases to provide blockchain-enabled functionality. This line of work describes how redundancy functions are used; the three types of anomalies that occur when a database is not normalized (insertion, update, and deletion anomalies); what the normal forms are and how many there are; and how the normal forms relate to those anomalies. It also examines how the CAP (Consistency, Availability, Partition tolerance) theorem known from databases influenced the DCS (Decentralization, Consistency, Scalability) theorem for blockchain systems, using a relaxation approach analogous to the one used in the proof of the CAP theorem, and postulates a "DCS-satisfiability conjecture." Finally, it reviews databases designed specifically for blockchain that provide most of the blockchain functionality, such as immutability, privacy, and censorship resistance, alongside standard database features.
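The dependency-based determination of keys mentioned above can be sketched concretely. The following fragment is only a minimal illustration under assumed inputs: the schema R(A, B, C, D) and the functional dependencies A -> B and B, C -> D are hypothetical, and the brute-force key search is not how a production tool would proceed. It computes attribute closures (repeated application of the given dependencies, in the spirit of Armstrong's axioms) and then looks for minimal attribute sets whose closure covers the whole schema:

```python
from itertools import combinations

def closure(attrs, fds):
    """Attribute closure: everything functionally determined by attrs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If we already hold the left side, add the right side.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def candidate_keys(schema, fds):
    """Brute-force search for minimal attribute sets whose closure is the whole schema."""
    schema = set(schema)
    keys = []
    for size in range(1, len(schema) + 1):
        for combo in combinations(sorted(schema), size):
            if any(set(k) <= set(combo) for k in keys):
                continue                      # a smaller key is already contained in combo
            if closure(set(combo), fds) == schema:
                keys.append(combo)
    return keys

# Hypothetical schema R(A, B, C, D) with A -> B and B, C -> D.
R = {"A", "B", "C", "D"}
fds = [({"A"}, {"B"}), ({"B", "C"}, {"D"})]
print(closure({"A", "C"}, fds))   # {'A', 'B', 'C', 'D'} -> A, C is a superkey
print(candidate_keys(R, fds))     # [('A', 'C')] -> the only candidate key
```

Once the candidate keys are known, they can be used to test the schema against the higher normal forms.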
Normal forms set out dependency properties that a schema must satisfy, and each normal form gives guarantees about the presence and/or absence of update anomalies. This means that higher normal forms have less redundancy and, as a result, fewer update problems. One proposed system for teaching college-level students the basic concepts of database normalization uses an algorithmic, pseudo-code approach to explain the mechanism of normalization, together with a simple evolving spreadsheet that shows the effects of normalization in an organized, conceptual manner. A related research project attempts to quantify the technical cost of running a database at various levels of normal form. The project describes the problems that arise from tables that do not adhere to fourth normal form and uses the term "technical debt" to explain how administrators can track the efficiency of databases and the drawbacks of tables that only reach first, second, or third normal form; it then explains, in technical terms, the significance of normal forms in database systems. A journal article describing the benefits of database normalization suggests three major variables to consider when designing a database: the number of anomalies, the time needed for a transaction, and the storage requirements. After establishing the significance of these factors, the article describes a simple decision tree through which one can analyze the database and optimize its normalization more effectively.

The first three normal forms are based on primary keys, while the general definitions of the second and third normal forms also consider candidate keys when a relation has more than one key. Relational databases operate at two distinct levels, the logical level and the storage level, and the normal forms are applied through algorithms in which each individual entity is described by a tuple. In normalization we try to reduce the redundancies in our databases, because redundancy leads to insertion, deletion, and update anomalies: if a database is not normalized, updating one part of the data may unintentionally leave other copies of that data inconsistent. Separating information into separate tables helps reduce this redundancy, and designs should also avoid null values where possible. If data must be compressed, lossless compression should be used rather than lossy compression, since lossless compression does not lose any bits in the process whereas lossy compression does. Constraints also play a role in normalization: as a database is normalized, adding the crucial constraints helps determine the meaning of the data and define the interrelationships between data items. Normalization breaks the attributes of poorly designed relations into smaller relations. The fourth normal form is based primarily on keys and multivalued dependencies, and the fifth normal form is based primarily on keys and join dependencies. A database design is usually reconsidered for normalization once the utility of the constraints on its tables becomes questionable, and most database designers do not go past the third normal form. When a schema has more than one key, those keys are called candidate keys. The first normal form does not allow composite attributes, multivalued attributes, or nested relations. The second normal form is defined using functional dependencies together with the primary key, and the third normal form requires every non-key attribute to depend only on the key. BCNF is considered stronger than the third normal form.
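To make the update anomaly just described concrete, the minimal sketch below uses hypothetical enrollment data in which a course title is stored once per enrolled student; updating only one copy leaves the table inconsistent, whereas the normalized design has a single place to change:

```python
# A minimal sketch (hypothetical data) of an update anomaly caused by redundancy.
# The course title "Databases" is stored once per enrolled student, so renaming
# the course requires touching every copy; missing one leaves the table inconsistent.

enrollments = [
    {"student_no": 1, "course_no": "CS101", "course_title": "Databases"},
    {"student_no": 2, "course_no": "CS101", "course_title": "Databases"},
]

# A careless update that only changes the first matching row:
enrollments[0]["course_title"] = "Database Systems"
titles = {row["course_title"] for row in enrollments if row["course_no"] == "CS101"}
print(titles)   # {'Databases', 'Database Systems'} -> two conflicting titles

# In the normalized design the title lives in exactly one row of a course table,
# so a single assignment updates it everywhere it is referenced.
courses = {"CS101": "Databases"}
courses["CS101"] = "Database Systems"
print(courses["CS101"])   # 'Database Systems'
```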
The goal is ultimately to have databases that are in third normal form transition into BCNF.
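A simplified sketch of that transition is given below. The schema R(A, B, C, D) and the dependencies A -> B and B, C -> D are assumed purely for illustration; closure() is the same attribute-closure helper as in the earlier sketch (repeated here so this fragment runs on its own), and the restriction of dependencies to each fragment is deliberately simplified (a complete implementation would project them using attribute closures):

```python
# A simplified sketch, not a full algorithm: BCNF check and decomposition for a
# toy schema. FDs are (lhs, rhs) pairs of attribute sets.

def closure(attrs, fds):
    """Attribute closure under the given functional dependencies."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def bcnf_decompose(attrs, fds):
    """Recursively split on a BCNF-violating FD (lossless-join split).
    NOTE: restricting the FDs to each fragment is simplified here; a complete
    implementation projects dependencies using attribute closures."""
    attrs = set(attrs)
    for lhs, rhs in fds:
        if rhs <= lhs:
            continue                          # trivial FD, never a violation
        lhs_plus = closure(lhs, fds) & attrs
        if lhs_plus != attrs:                 # lhs is not a superkey: BCNF violation
            r1 = lhs_plus                     # attributes determined by lhs
            r2 = (attrs - lhs_plus) | set(lhs)
            fds1 = [(l, r & r1) for l, r in fds if l <= r1 and r & r1]
            fds2 = [(l, r & r2) for l, r in fds if l <= r2 and r & r2]
            return bcnf_decompose(r1, fds1) + bcnf_decompose(r2, fds2)
    return [attrs]                            # already in BCNF

# Hypothetical example: R(A, B, C, D) with A -> B and B, C -> D (key is A, C).
R = {"A", "B", "C", "D"}
fds = [({"A"}, {"B"}), ({"B", "C"}, {"D"})]
print(bcnf_decompose(R, fds))   # e.g. [{'A', 'B'}, {'A', 'C', 'D'}]
```

In this example the violating dependency A -> B drives a lossless split into (A, B) and (A, C, D); both fragments are in BCNF, but the dependency B, C -> D is no longer enforceable inside a single fragment, which is the well-known trade-off when moving from third normal form to BCNF.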