An interview with Yurii Parkhuts, a System Architect at Forbytes
Welcome to the interview with our system architect, dedicated to sharing his vision of project success and uncovering the secrets of effective data modeling.
Forbytes is constantly looking for ways to make data modeling more efficient, helping businesses achieve project excellence. We also aim to help companies understand the relationships between different datasets, enabling them to make better decisions and reach their business objectives.
That’s why we’re ready to share the insights we’ve gathered. In this interview, our system architect, Yurii Parkhuts, will reveal how Forbytes approaches data modeling in the context of product development and explain the criteria for building accurate data models.
Please note that this interview has been edited and condensed for clarity.
What Is Data Modeling?
A data model is a map of an actual data structure. It shows business concepts and how they are related to each other.
In software product development, data modeling is a crucial and mandatory step. It helps companies move from raw data (concepts and ideas) to data products and valuable insights. First, we brainstorm various ideas; then we visualize them by transforming them into structured datasets.
Data modeling starts during our first client meeting. We analyze the client’s business, understand their problems, and uncover requirements. All the collected data can then be used to write detailed project specifications. It’s no coincidence that the deeper we dive into a client’s business context, the better we can model their data and use it to the full benefit of the final solution.
What Are the Types of Data Modeling?
Data models can be conceptual, logical, and physical, but these types are interrelated and interdependent. For instance, skipping conceptual modeling and starting with logical modeling is not recommended: it can undermine the team’s understanding of the data structure. That’s why conceptual, logical, and physical data modeling shouldn’t be viewed as distinct stages, but rather as links in a single chain.
Conceptual data modeling is built based on information gathered from the client. This is the most abstract level of data modeling. At this stage, it’s important to team up with a client to define key concepts and understand their role and relations.
Logical data modeling builds on the conceptual data model. During this stage, we extend the model with more detail, such as entity attributes and the relationships between entities, establish logical rules, and more. This phase of data modeling is vital for making changes and optimizations. Paying attention to small details becomes essential, as they help flesh out the conceptual model.
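To make the idea of a logical model more tangible, here is a minimal sketch in Python. The entity and attribute names (product, order, order line) are illustrative assumptions, not taken from any particular project; the point is that entities, their attributes, their relationships, and a simple logical rule can all be expressed before any database exists.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Product:
    # attributes of the "product" entity
    name: str
    price: float

@dataclass
class OrderLine:
    # relationship: each order line references exactly one product
    product: Product
    quantity: int

@dataclass
class Order:
    customer_email: str
    # one-to-many relationship: an order holds many order lines
    lines: List[OrderLine] = field(default_factory=list)

    def total(self) -> float:
        # a simple logical rule derived from the attributes
        return sum(line.product.price * line.quantity for line in self.lines)
```

For example, an order with two 90.00 shirts would total 180.00, a value the model can derive from its own structure without any storage decisions having been made yet.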
Physical data modeling involves the actual implementation and application of a logical data model. During this stage, it’s time to decide what database management system will be used, consider data types and relationships between tables, and define primary and foreign keys.
What’s more, at this phase, you should also think about building the right indices in each table to achieve optimal performance for create, read, update, and delete (CRUD) operations. Finally, it’s necessary to establish constraints and logical rules.
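The physical-level concerns mentioned above can be sketched in a few lines of SQL. This example uses SQLite (via Python) purely for portability; the table and column names are invented for illustration, but the ideas — primary and foreign keys, data types, CHECK constraints, and an index on a frequently queried column — carry over to any relational DBMS.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce foreign key constraints

conn.executescript("""
CREATE TABLE customer (
    id    INTEGER PRIMARY KEY,     -- primary key identifies each entity
    email TEXT NOT NULL UNIQUE     -- constraint: emails must be unique
);
CREATE TABLE customer_order (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(id),  -- foreign key
    total       REAL NOT NULL CHECK (total >= 0)           -- logical rule
);
-- an index on the foreign key speeds up the "read" side of CRUD
CREATE INDEX idx_order_customer ON customer_order(customer_id);
""")

conn.execute("INSERT INTO customer (email) VALUES ('anna@example.com')")
conn.execute("INSERT INTO customer_order (customer_id, total) VALUES (1, 120.0)")
```

With foreign keys enforced, an order that references a non-existent customer is rejected by the database itself, which is exactly the kind of integrity guarantee the physical model is meant to provide.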
But this is not the end of the data modeling process. Even after completing all these stages, there is often a need for subsequent changes or optimizations.
How Forbytes Approaches Data Modeling
Let’s look at a real Forbytes project to see how these types of data modeling work in practice. One relatable case is our collaboration with Stenströms, an elite fashion brand well known in Europe and North America, with a wide distribution network.
Stenströms specializes in designing and crafting men’s shirts. Selling shirts may seem ordinary, but this brand is unique: it personalizes its approach to each client. Customers can configure their shirts according to their body measurements, fabric quality, style, cuffs, buttons, and collar type. In a way, each customer becomes the designer of their own shirt.
Imagine how much data the company had to gather about every client. And since this was the premium segment, they used to do it manually. The company succeeded in the market, but soon faced a dilemma: how to digitize their system to speed up work with every client and minimize the probability of errors. That’s why they reached out to Forbytes.
Stenströms focused on the body measurements and individual preferences of every customer. So, our main and challenging task was to structure all this data into a logical system, serving as a basis for building software.
At the conceptual level, our task was to understand how our client’s business works, reveal their problems, and define the key business concepts used for data modeling. The main entities here were customer, order, and order details. Order details included entities such as fabrics, colors, patterns, shirt styles, and customer body measurements, as well as collar types, buttons, cuffs, and sleeve lengths.
By defining all these concepts together with our client, we understood what business entities we had. We also learned their attributes and how to build relations between them.
At the logical level, we focused on entity attributes, relationships between entities, and constraints, all of which we needed to produce a quality product. For example, the fabric entity had attributes such as name, color, pattern, and price. The shirt model had attributes that referenced the identifiers of other entities, like fabric, style, collar type, cuffs, buttons, sleeve length, and body measurements. We also defined a set of logical rules and constraints for the entities.
For example, not every shirt style allows every fabric, and details such as collar type or sleeve length cannot be ignored. If a shirt is slim fit, there are constraints on which collar types it can have. Some shirts can only have short sleeves.
That’s why constraints were needed to ensure the integrity and consistency of data. What’s more, constraints assisted in creating a truly unique product.
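A style constraint of the kind described above can be sketched as a small validation function. The specific rules below (which collars and sleeve lengths a style permits) are entirely hypothetical; in the real project, such constraints lived in the data model itself.

```python
# Hypothetical rules for illustration only: each style lists the collar
# types and sleeve lengths it permits.
STYLE_RULES = {
    "slim_fit": {"collars": {"cutaway", "kent"}, "sleeves": {"long", "short"}},
    "hawaiian": {"collars": {"camp"},            "sleeves": {"short"}},
}

def validate_shirt(style: str, collar: str, sleeve: str) -> list:
    """Return a list of constraint violations; an empty list means
    the requested combination is valid for that style."""
    rules = STYLE_RULES[style]
    errors = []
    if collar not in rules["collars"]:
        errors.append(f"collar '{collar}' not allowed for style '{style}'")
    if sleeve not in rules["sleeves"]:
        errors.append(f"sleeve '{sleeve}' not allowed for style '{style}'")
    return errors
```

Checking each configuration against such rules at the model level is what keeps an invalid shirt — say, a long-sleeved Hawaiian shirt — from ever reaching an order.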
At this stage, we also created a rule engine to establish body parameters for every shirt style. There were limits to these parameters that couldn’t be violated; this was necessary to keep the proportions correct. What’s more, we also recorded how these personal measurements deviated from standard sizes.
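The core of such a rule engine can be sketched in a few lines. The measurement names, limits, and standard values below are invented for illustration; the real figures came from the client’s tailoring rules.

```python
# Hypothetical limits and standard-size reference values.
LIMITS   = {"chest_cm": (80, 140), "sleeve_cm": (55, 75)}
STANDARD = {"chest_cm": 100,       "sleeve_cm": 64}

def check_measurements(measurements: dict) -> dict:
    """Validate each measurement against its allowed range and report
    its deviation from the standard size."""
    report = {}
    for name, value in measurements.items():
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name}={value} outside allowed range {lo}-{hi}")
        report[name] = value - STANDARD[name]  # deviation from standard size
    return report
```

A customer with a 104 cm chest would be accepted and recorded as +4 cm against the standard, while a value outside the hard limits would be rejected before it could distort the garment’s proportions.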
At the physical level of data modeling, we opted for Microsoft Azure as the hosting solution for our client’s platform and the Microsoft SQL database as a data storage for enhanced functionality and compatibility.
What’s more, at this phase, we also built tables for each entity with their attributes, defined data types, and established relationships between them. For example, there were tables with customers, orders, order lines, shirts, different fabrics, shirt styles, collar types, cuffs, buttons, and more.
For every table, we defined primary keys for entity identification and foreign keys to set up relationships between entities (many-to-many, one-to-many, and one-to-one). At this stage, we also created table indices for faster processing and optimization, and defined constraints within tables to ensure data uniqueness and integrity.
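As an illustration of one of those relationship types, here is how a many-to-many relation — say, which fabrics each shirt style allows — is typically modeled with a junction table. The schema below uses SQLite for portability and invented table names; it is a sketch of the general technique, not the project’s actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.executescript("""
CREATE TABLE shirt_style (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE fabric      (id INTEGER PRIMARY KEY, name TEXT NOT NULL);

-- many-to-many relationship via a junction table; the composite
-- primary key doubles as a uniqueness constraint on each pairing
CREATE TABLE style_fabric (
    style_id  INTEGER NOT NULL REFERENCES shirt_style(id),
    fabric_id INTEGER NOT NULL REFERENCES fabric(id),
    PRIMARY KEY (style_id, fabric_id)
);
-- index for fast lookups from the fabric side
CREATE INDEX idx_style_fabric_fabric ON style_fabric(fabric_id);

INSERT INTO shirt_style (name) VALUES ('slim fit'), ('classic');
INSERT INTO fabric      (name) VALUES ('oxford'), ('linen');
INSERT INTO style_fabric VALUES (1, 1), (1, 2), (2, 1);
""")

allowed = conn.execute("""
    SELECT f.name FROM fabric f
    JOIN style_fabric sf ON sf.fabric_id = f.id
    WHERE sf.style_id = 1
    ORDER BY f.name
""").fetchall()
```

The junction table lets a style allow many fabrics and a fabric serve many styles, while the composite primary key prevents the same pairing from being recorded twice.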
The final stage of data modeling was optimization and extension. When our client created new shirt types with specific rules and asked us to add extra attributes to various entities, these changes were easily transferred into the database model, because it had been created with flexibility and extensibility in mind. So, as you can see, data modeling is a continuous process that accompanies product growth.
How to Create Data Models: Best Practices
To build accurate data models contributing to excellence in software development, we recommend considering the following criteria:
Naming conventions. To enhance the meaning and maintainability of a data model, it’s necessary to agree with a client on how to name objects. This will help avoid mistakes and confusion. Clear and standardized names for entities, attributes, and relationships make it easier for a client and developers to work with a data model. That’s why before starting data modeling, discuss the names of objects with your client.
Normalization. To improve data integrity and avoid redundancy, it’s necessary to organize data in a database efficiently. A well-designed data model should follow normalization principles, complemented by business rules and constraints, to ensure consistency. Normalization helps avoid undesirable characteristics in entities, such as update, insertion, and deletion anomalies.
Scalability. A robust data model must be scalable to efficiently handle large volumes of data. What’s more, it should be able to deal with the increase in records and users without performance drops. To achieve this, it’s necessary to integrate various architecture types, as focusing on one type may not address customers’ diverse needs. If a data model is scalable, it won’t be a problem to modify it and add new attributes.
Performance. A data model should be high-performing to deliver fast and efficient results. To achieve this, it’s necessary to employ techniques like indexing, optimization, and other performance-enhancing techniques. These measures can guarantee that a data model operates within acceptable response times.
Meeting business needs. A data model should not only support the business requirements but align with a client’s vision and goals. So, it’s recommended to talk and team up with a client to better understand and respond to their needs.
Data integrity. Building data models requires data accuracy and integrity. That’s why it’s necessary to establish business rules and constraints within the model to ensure data quality and consistency. By following these rules, you’ll fortify your data integrity, creating the ground for accurate analysis and informed decision-making.
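The normalization criterion above can be shown with a toy example. In the flat rows below (invented data), each customer’s attributes are repeated on every order, so changing one copy can leave the others inconsistent; splitting the repeating attributes into their own entity removes that redundancy.

```python
# Toy, denormalized data: customer attributes repeat on every order row.
flat_orders = [
    {"order_id": 1, "customer": "Anna",  "email": "anna@example.com",  "total": 120},
    {"order_id": 2, "customer": "Anna",  "email": "anna@example.com",  "total": 80},
    {"order_id": 3, "customer": "Bjorn", "email": "bjorn@example.com", "total": 95},
]

def normalize(rows):
    """Split the repeating customer attributes into their own entity,
    keeping only a reference (the email) in each order row."""
    customers = {}  # each customer is now stored exactly once, keyed by email
    orders = []
    for row in rows:
        customers[row["email"]] = {"name": row["customer"]}
        orders.append({"order_id": row["order_id"],
                       "customer_email": row["email"],
                       "total": row["total"]})
    return customers, orders
```

After normalization, two customers and three orders remain, and updating a customer’s name now touches exactly one record instead of every order they ever placed.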
How to Streamline Data Modeling?
Data modeling is a complex and challenging process. Unfortunately, we can’t speed it up as much as we would like, and there are no tools that can perform it for us. However, there are ways to streamline it. Here is a list of my recommendations.
First, follow all stages of data modeling, dividing the process into small steps. As I mentioned above, it’s crucial to start with conceptual data modeling and progress through to the final stage of physical modeling. Maintain consistency and coherence in your steps and strategies.
Second, do not seek easy ways if you care about project quality, efficacy, and success. While it’s tempting to speed up the data modeling process, rushing can cause mistakes, confusion, and negative outcomes. Instead, progress from conceptualization through visualization to implementation, achieving excellence in software development.
Third, you can use tools and software that foster the efficiency of data modeling. For example, Visual Studio can generate a data model. But mind you, this is a quick-and-dirty approach, and in any case you will need experts who can transform an automatically generated data model into one that is accurate and relevant for your client. No tool can create the final data model; tools are helpful instruments, especially once you already have a conceptual model.
All in all, conceptual, logical, and physical data models are important for a project’s success. Undervaluing the conceptual level of data modeling is a common and costly mistake. It defines and influences further mapping of data structure. The reason many changes and optimizations are needed before a release is that companies often skip conceptual data modeling.
At Forbytes, we offer our clients a structured and efficient approach to organizing datasets. This enables businesses to achieve better scalability, performance, data management, and integration through conceptual, logical, and physical modeling. Our data modeling expertise will help your business make better decisions, work more efficiently, and succeed.
Do you want to build your solution together with Forbytes? Get in touch with our team! We’re looking forward to connecting with you!