
Case study on Normalisation | Machine Learning | SNS institutions

#snsinstitutions #snsdesignthinkers #designthinking

This video describes the following:
Normalization, in the context of databases and machine learning, refers to the process of organizing and structuring data to minimize redundancy and improve data integrity. In database design, it involves breaking down tables to reduce anomalies and enhance data consistency. In machine learning, it's a preprocessing step that scales data to a similar range, often used for feature scaling.
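The feature-scaling sense of normalization can be sketched in a few lines. The following is a minimal illustration (the sample data and function names are made up for this example) of two common methods: min-max scaling, which maps values into [0, 1], and z-score standardization, which centers values at zero with unit standard deviation.

```python
def min_max_scale(values):
    """Rescale values linearly into the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def z_score_scale(values):
    """Center values at 0 with unit (population) standard deviation."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

ages = [20, 30, 40, 50, 60]
print(min_max_scale(ages))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

Scaling features to a similar range like this keeps one large-valued feature from dominating distance-based or gradient-based learning algorithms.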
Database Normalization:
Purpose:
To eliminate redundancy and inconsistencies in a relational database.
Process:
Involves creating smaller tables with defined relationships and following specific normal forms (1NF, 2NF, 3NF, etc.).
Benefits:
Reduces storage space, improves data consistency, prevents update anomalies, and makes data easier to manage.
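The decomposition step can be illustrated with Python's built-in sqlite3 module. This is a small sketch, not a full normalization walkthrough; the table and column names (customers, orders) are invented for the example. Instead of one wide table that repeats each customer's details on every order row, the customer data lives in its own table and orders reference it by key, so each fact is stored once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized schema: customer details are stored once in "customers";
# each order row carries only a reference (foreign key) to the customer.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("""CREATE TABLE orders (id INTEGER PRIMARY KEY,
                                    customer_id INTEGER REFERENCES customers(id),
                                    item TEXT)""")

cur.execute("INSERT INTO customers VALUES (1, 'Asha', 'Chennai')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(101, 1, 'laptop'), (102, 1, 'mouse')])

# Two orders, but the customer's name and city appear in exactly one row.
cur.execute("SELECT COUNT(*) FROM customers")
print(cur.fetchone()[0])  # 1
```

In the unnormalized version, 'Asha' and 'Chennai' would be duplicated on both order rows, which is exactly the redundancy the normal forms are designed to remove.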
Advantages of Normalization
Normalization eliminates data redundancy and ensures that each piece of data is stored in only one place, reducing the risk of data inconsistency and making it easier to maintain data accuracy.
By breaking down data into smaller, more specific tables, normalization helps ensure that each table stores only relevant data, which improves the overall data integrity of the database.
Normalization simplifies the process of updating data, as it only needs to be changed in one place rather than in multiple places throughout the database.
Normalization enables users to query the database by a variety of criteria, as the data is organized into smaller, more specific tables that can be joined together as needed.
Normalization can help ensure that data is consistent across different applications that use the same database, making it easier to integrate different applications and ensuring that all users have access to accurate and consistent data.
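Two of the advantages above, updating data in one place and joining tables back together for queries, can be shown in one short sketch. It reuses an illustrative customers/orders split (all names are assumptions for the example): a single UPDATE fixes the customer's city everywhere, and a JOIN reassembles the order data with the customer data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
cur.execute("INSERT INTO customers VALUES (1, 'Asha', 'Chennai')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(101, 1, 'laptop'), (102, 1, 'mouse')])

# The city is stored once, so one UPDATE corrects it for every order.
cur.execute("UPDATE customers SET city = 'Coimbatore' WHERE id = 1")

# A JOIN recombines the normalized tables for querying.
cur.execute("""SELECT o.item, c.city
               FROM orders o JOIN customers c ON o.customer_id = c.id
               ORDER BY o.id""")
print(cur.fetchall())  # [('laptop', 'Coimbatore'), ('mouse', 'Coimbatore')]
```

In a denormalized table the same change would require updating every order row, and missing one of them is precisely the update anomaly normalization prevents.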
Disadvantages of Normalization
Normalization can add performance overhead, because queries often require additional join operations that can slow execution.
Normalization can result in the loss of data context, as data may be split across multiple tables and require additional joins to retrieve.
Proper implementation of normalization requires expert knowledge of database design and the normalization process.
Normalization can increase the complexity of a database design, especially if the data model is not well understood or if the normalization process is not carried out correctly.

Video "Case study on Normalisation | Machine Learning | SNS institutions" from the channel S.Saranya SNS.