
Using Fuzzy Logic to Evaluate Normalization Completeness for An Improved Database Design (1204.0176v1)

Published 1 Apr 2012 in cs.DB

Abstract: This paper introduces a new approach to measuring the normalization completeness of a conceptual model using quantitative fuzzy functionality. Normalization completeness is measured in two steps. In the first step, normalization techniques up to Boyce-Codd Normal Form (BCNF) are analyzed to determine the current normal form of each relation. In the second step, fuzzy membership values are used to scale the normal form between 0 and 1. Case studies are presented to explain the schema transformation rules and measurements. Normalization completeness is measured from the completeness attributes, the preventing attributes of the functional dependencies, and the total number of attributes: if a functional dependency is non-preventing, its attributes are completeness attributes, while the attributes of a functional dependency that prevents a relation from reaching the next normal form are called preventing attributes.
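The abstract's idea of scoring completeness from preventing vs. non-preventing functional dependencies can be sketched as follows. This is a minimal illustration, not the paper's exact formula: it assumes the fuzzy completeness value is the fraction of a relation's attributes not involved in any preventing functional dependency, and the names `completeness` and the FD tuple layout are hypothetical.

```python
def completeness(total_attrs, fds):
    """Hypothetical fuzzy completeness score in [0, 1] (assumed formula,
    not taken from the paper).

    total_attrs: set of attribute names in the relation.
    fds: list of (lhs, rhs, preventing) tuples, where `preventing` is True
         if the FD blocks the relation from reaching the next normal form.
    """
    preventing_attrs = set()
    for lhs, rhs, blocks in fds:
        if blocks:
            # Attributes of a preventing FD are "preventing attributes".
            preventing_attrs |= set(lhs) | set(rhs)
    # Remaining attributes count as completeness attributes.
    completeness_attrs = total_attrs - preventing_attrs
    return len(completeness_attrs) / len(total_attrs) if total_attrs else 1.0


# Example: in a relation R(A, B, C, D), the FD A -> B is non-preventing
# while C -> D blocks the next normal form, so only {A, B} are
# completeness attributes and the score is 2/4 = 0.5.
attrs = {"A", "B", "C", "D"}
fds = [(("A",), ("B",), False), (("C",), ("D",), True)]
print(completeness(attrs, fds))  # 0.5
```

A fully normalized relation (no preventing FDs) would score 1.0 under this assumed definition, matching the paper's stated [0, 1] scale.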

Citations (5)
