The importance of business value tools in medium and large-scale E-Commerce projects

Rudi Rocha
11 min read · Sep 17, 2020
Image from: https://unsplash.com/photos/lRoX0shwjUQ

I wrote this article some years ago (2015) when I was working at VentureOak. Today I was in a brainstorming session and this topic came up! I thought I’d bring it here and share it with my followers.

Big e-commerce projects can bring big problems for business value, especially where databases are concerned. As an e-commerce project grows, it will inevitably rely on a database and, technical considerations aside, it is important that the reality of the business matches the reality stored in that database.

Bugs, and sometimes features, can introduce incorrect “state machines” into our business reality, and until they are discovered, every reading process gets a wrong view of the business. From the first moment a piece of functionality starts saving bad data to the database, your project works with that bad information.

Once “business data” problems appear, the project team has to fix them. Solving the problem at its root cause is part of it, but the team cannot forget that the database also needs to be brought back in line with reality. These database tasks can be done safely with routines that fix each record of every affected entity, or directly against the database itself, assuming the risk ourselves. In the end, both approaches are a “data fix”.
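As a hedged illustration of the first approach, such a routine might look like this in Laravel’s query builder (the orders table, its columns and the chosen fix are all hypothetical):

```php
<?php
// A minimal sketch of a record-by-record "data fix" routine, assuming
// a hypothetical orders table where some delivered orders were saved
// without a delivery date. Table and column names are illustrative.

use Illuminate\Support\Facades\DB;

DB::table('orders')
    ->where('status', 'delivered')  // order already delivered...
    ->whereNull('delivered_at')     // ...but the delivery date was never stored
    ->orderBy('id')
    ->chunk(500, function ($orders) {
        foreach ($orders as $order) {
            // Fix each affected record individually, so every change
            // stays small, reviewable and easy to roll back.
            DB::table('orders')
                ->where('id', $order->id)
                ->update(['delivered_at' => $order->updated_at]);
        }
    });
```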

In the short term both solutions can be acceptable, but not in the medium or long term. The business value of a project is directly threatened by inconsistent databases, because the stored reality is not reliable. It is also too expensive to dedicate a single person to monitoring all databases, searching for this kind of problem and reporting wrong situations.

This problem made me think about a way to help the development team from a business value point of view. The solution revolves around a tool that applies validations to the resulting data and produces readable files flagging every bad case, across all of the project’s functionality. I’ll investigate several possibilities and explain which one can best address this threat to business value.

The main motivation: thinking about a big problem

As a developer working on a large e-commerce project, I saw several incoming requests from the management layer asking us to fix different database situations. User mistakes, and even bugs, messed up the database; worse, some data stayed broken for a long period, forcing a big fixing task. What if we had something that validated business data and told us when something went wrong?

Databases as a threat to the business value of a project

E-commerce projects need to store large amounts of data. Everything the project’s functionality produces is consumed by multiple kinds of processes that read, update and delete it. In the medium and long term, database problems and their fixes steadily erode trust in the project, because newly generated information is based on a wrong picture of the business. Even if the database is corrected, the effort spent on fixing it may not be enough to prevent new cases, or to prevent financial injury to the project.

How to protect Business Value from the IT team’s point of view

The Business Value of a project is shaped by numerous factors, some direct and some indirect.

Technical factors such as scalability, the technologies used and the production process directly drive the application’s business value. Through them, the customer judges the application or product on its value and trustworthiness.

In the production process, beyond the development teams themselves, projects rely on roles that ensure new features and bug fixes are integrated successfully, including QA and “testers”. Both during development and in a controlled “staging” environment, manual and automated tests search for failures on the technical and the business side. None of this excludes unit testing, which directly affects what programmers deliver and has proven a huge advantage for the quality and functionality of the final product.

As for human factors, team management, the planning of new features and the handling of application problems contribute, more indirectly, to the project’s business value. The interaction between team members, whatever their duties, is equally important. Clearly, the greater each individual’s know-how and the better their interaction with the others, the better the analysis managers can make. Likewise, the more effective the planning to integrate a given need, the more the business value grows, and the more securely it does so.

Consider a medium or large-scale project, whether in e-commerce, logistics, management or another domain: after a certain period of time it is common for collisions to appear between the actions of different features, or even between the business rules themselves. Usually the planning team, together with the customer, makes the decisions needed to resolve the collision. The time required to solve the problem, from analysis and decision-making through development and deployment, can be enough to generate wrong information in the existing databases, and disabling the affected application area as a preventive measure is usually not an option. In more severe situations, the database can even accumulate wrong business logic, because integrated development anomalies go unnoticed until they are finally identified.

The process

When a user reports a data error, it is necessary to investigate the cause and fix the feature. At the same time, the client needs the information back in its expected state machine, which requires data correction tasks. Whoever performs these tasks at the database level needs to evaluate all their implications and restrictions, and then build the technical instructions for the fix. This process can become complicated, time-consuming and dangerous, the more so the larger the database in question and the number of affected entities, and the more similar cases are found along the way during analysis and correction. A single abnormal situation in the use of the application can trigger a lengthy and expensive correction process, and one that may not even be effective.

We are facing a clear threat to the application’s business value. The database represents the business reality from a virtual perspective, and if it is misleading, data inconsistency problems can cause the project to fail. Other complications arise when we deal with inconsistent data: take the example of an e-commerce store that shows stock for products that do not physically exist in the warehouse.

As a safeguard, business intelligence elements can be used. Backed by data warehouses, they provide the strategic management layer with extracted and processed information, in readable formats, about the reality of the business. The problem is that these tasks are mostly performed to help increase Business Value through reports and KPIs showing how the application is returning the investment, not to catch inconsistencies.

The need for active monitoring of the results generated by the application becomes evident. Data inconsistency at read time, whether for planning purposes or for the project’s own consumption, directly calls into question the value the business can achieve, under penalty of imminent failure.

Short-term Solutions

Without the planning and analysis of a definitive implementation, coordinating direct queries against the databases to find error conditions falls to the management team. This may not be efficient, because at any given time there may be no specific situations to validate. On the other hand, it isn’t productive to write one database query for every possible data manipulation. The team ends up depending on bug reports from users.
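For instance, an ad-hoc check of this kind might be a single raw query, run by hand when a suspicion appears (the schema below is a hypothetical sketch, not a real project’s):

```php
<?php
// A sketch of a one-off consistency check, run manually by the team.
// The orders / order_items / stock schema below is hypothetical.

use Illuminate\Support\Facades\DB;

// Orders marked as delivered whose items are still flagged as
// "in stock" in the inventory table -- a contradiction in the data.
$suspects = DB::select("
    SELECT o.id AS order_id, s.product_id
    FROM orders o
    JOIN order_items oi ON oi.order_id  = o.id
    JOIN stock s        ON s.product_id = oi.product_id
    WHERE o.status = 'delivered'
      AND s.status = 'in_stock'
");
```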

Once specific situations appear and, as already described, all possible cases have been identified, the data fixing tasks follow. These tasks can be developed in a reusable way, as recurring procedures scoped to a specific data range, or as one-off routines for each case. It is up to the project management layer to decide whether this approach is the most appropriate, a decision that may rest on a deep analysis of the process model and the database in use, to establish the current level of trust in the application. However, these actions should remain short-term only: allocating a resource to this kind of task slows the development of new features, especially if those features are related to the data errors found.

Another option is to cooperate with the B.I. teams. They can provide reports tailored to each validation to be performed, and may even produce quantitative error estimates alongside their calculations. Again, in the medium and long term this approach can be problematic, because it is tied to the entire process and workflow implemented in their extraction and data-processing pipelines.

Suggested Solution

The proposed medium and long-term solution is a tool that automatically accesses the database and validates the different states of the stored entities. The tool should grow alongside the main application’s development, even if the application is more mature than the tool itself. This growth requires constant collaboration between the development layer, the QA layer and project management, because the rules of a particular project are too specific to be applied to any and every project. It should therefore be a “data integrity component”, ideally developed by the main project’s own development teams. As unit tests are created, validation tasks for the data integrity component should be designed and implemented as well, or adjusted as processes and business rules change.

Considering a development environment like “local — staging — production”, the tool structure might start with some basics:

  1. At “local”, the database should always be a recent snapshot: analyzing the generated information only makes sense if the database holds the most current data. So, on an appropriate schedule, the database should be copied to the development environment (“local”), where the analysis tasks can run at whatever performance cost without the production environment being affected (see the scheduling sketch after this list). Considering we are dealing with an online store, it is not right to validate order data on the same database where users are triggering the creation of new records.
  2. The data integrity component should not be deployed to “production” or even to “staging”: depending on how the project’s elements and roles are organized, it makes sense for the component to be available only in “local”, even if it is shared with the entire IT team.
  3. The tool should have a user-friendly GUI: interaction with the tool should go through a graphical interface, so that any member of the team can run new checks and get their results.
  4. The tool should be multitasking: task concurrency is a necessity, processing different user requests, data validations and result reports at the same time.
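As a minimal sketch of point 1, the snapshot copy could even be scheduled with Laravel’s own task scheduler (the `mysqldump` pipeline, host names and database names below are placeholder assumptions):

```php
<?php
// app/Console/Kernel.php (excerpt) -- a sketch of scheduling the
// nightly database snapshot for the "local" analysis environment.
// The mysqldump command, hosts and database names are placeholders.

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // Copy production data into the local analysis copy off-peak,
        // so production is never burdened by the analysis itself.
        $schedule->exec(
            'mysqldump -h prod-replica shop | mysql -h localhost shop_snapshot'
        )->dailyAt('03:00');
    }
}
```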

With the basic principles in place, it is up to the development team to define the remaining requirements for implementing this data integrity component.

Once all the analysis, requirements and needs are clear, a technology must be chosen to support them. I think the best starting point is a development framework that provides the largest number of features out of the box, so the focus stays on building the validation tool itself, without ignoring factors such as performance, cross-platform support, interactivity and the language it is built in. For the proposed solution, frameworks such as Laravel (PHP), Node.js (JavaScript), Zend Framework 2 (PHP) and Sails.js (JavaScript) were considered. Laravel was the choice, given its performance (it is a very lightweight framework), its queue support (the ability to manage background tasks, either locally or connected to queue API services) and its quite short learning curve.

Development of the data integrity component started with a job management “engine”, implemented successfully on top of a Beanstalkd queue and Laravel’s Queue feature. This made it possible to keep processes alive on a web server, listening for new instructions created through the tool.
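Wiring Laravel’s Queue feature to Beanstalkd is mostly configuration (the driver needs the `pda/pheanstalk` package); a sketch of the relevant `config/queue.php` entry:

```php
<?php
// config/queue.php (excerpt) -- pointing Laravel's queue at Beanstalkd.
// Host and tube names are local defaults; adjust to your setup.

return [
    'default' => 'beanstalkd',

    'connections' => [
        'beanstalkd' => [
            'driver' => 'beanstalkd',
            'host'   => 'localhost', // the beanstalkd daemon
            'queue'  => 'default',   // tube used for validation jobs
            'ttr'    => 60,          // seconds a job may run before being released
        ],
    ],
];
```

A worker kept alive with `php artisan queue:work` (or `queue:listen` in older Laravel versions) is the “active process” that listens for new instructions.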

With basic communication between the application and the queuing system in place, the next step was the job structure that asynchronously queries the database and processes the information according to procedures defined by the business logic. Leaving aside the technical details, which mostly concern the class structure and design patterns used, the tool proved able to gather from the databases the records to be analyzed (by selecting a specific type of job and receiving the corresponding time interval from user input) and to apply all the necessary validation rules defined in configuration files.
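Without reproducing the real class structure, a queued validation job in this style could look roughly like the sketch below (class, tables and the report path are hypothetical; the actual tool drives checks like this from configuration files rather than hardcoding them):

```php
<?php
// A sketch of an asynchronous validation job. Class, tables and the
// report path are hypothetical; the real tool drives checks like
// this from project-specific configuration files.

namespace App\Jobs;

use Carbon\Carbon;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Storage;

class ValidateOrderStock implements ShouldQueue
{
    use Queueable;

    private $from;
    private $to;

    // The time interval comes from the user's input in the GUI.
    public function __construct(Carbon $from, Carbon $to)
    {
        $this->from = $from;
        $this->to   = $to;
    }

    public function handle()
    {
        // Gather only the records inside the requested interval, then
        // apply the rule: a delivered order's item cannot be "in stock".
        $bad = DB::table('orders')
            ->join('order_items', 'order_items.order_id', '=', 'orders.id')
            ->join('stock', 'stock.product_id', '=', 'order_items.product_id')
            ->whereBetween('orders.updated_at', [$this->from, $this->to])
            ->where('orders.status', 'delivered')
            ->where('stock.status', 'in_stock')
            ->select('orders.id as order_id', 'stock.product_id')
            ->get();

        // Produce a readable report file listing every bad case found.
        Storage::put(
            'reports/order-stock-' . $this->to->format('Ymd') . '.json',
            $bad->toJson()
        );
    }
}
```

A GUI action would then push it onto the queue with `dispatch(new ValidateOrderStock($from, $to));`.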

At this point, the project has a tool that validates information, flagging every wrong case and, likely, the reason the respective “state machine” is wrong. Taking the example of a logistics management system: if an order is tagged as delivered, but the sold item is still stored in the information system as “in stock”, the error will be recorded.
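To keep the rules project-specific, the checks themselves can be declared in a configuration file, as mentioned above. A sketch of what such a file could look like (the file name, keys and job class are all hypothetical):

```php
<?php
// config/integrity.php (sketch) -- hypothetical, project-specific
// declarations of which validation jobs exist and how severe a
// failed check is. Names and keys are illustrative only.

return [
    'rules' => [
        'order_stock' => [
            'job'      => \App\Jobs\ValidateOrderStock::class,
            'severity' => 'error', // delivered orders must not be "in stock"
        ],
        // further rules are added here as processes and business
        // rules evolve, as proposed in the solution above
    ],
];
```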

Pros and cons of the proposed solution

After this tool was implemented, results were visible from the very first tasks: validations run against different business entities flagged the wrong and suspicious cases, each of which was reviewed by the project’s operational layer. For each reported case it was possible to pinpoint, with great precision, which application area was generating bad data, and to patch that data very quickly. With validations running at daily intervals, data is corrected at very short notice and Business Value is neither compromised nor turned into a loss. In an e-commerce business, the state machine between orders and stock is vital to the project’s Business Value, since financial reporting, inventory balances and supplier purchase orders are all based on it. Thus, for each order with an incorrect stock status, within moments you can repair the status, or ask the operational staff to physically check the actual stock.

There were, however, negative aspects to the implementation. The first, and the one with the greatest impact, was storage: once validations ran against multiple databases, storage needs became a problem. Upholding the basic principle of keeping a local copy (even on a server inside the IT network), while analyzing several databases whose sizes were growing quickly, required investment in supporting hardware.

Another point to highlight is the IT team’s availability to allocate resources to maintain the analysis tool. The technical aspects at the infrastructure level, as well as the business value aspects of maintaining validation rules and creating new validation tasks, can become costly for the project.

Conclusion

A project’s value is determined by everything around it. The contribution of every staff member, and of all the tools available, makes this value vary.

Developing a tool within the Business Value scope gives a direct view of the reality of the operational system and how it behaves. Being able to find the flaws, receive alerts, or even reach enough maturity for automatic resolution of these conflicts greatly reduces threats that, in an extreme scenario, can injure the business at many levels.

When the whole team is focused on developing and shipping new features and on resolving performance issues and errors, the concern for business value can be lost. A project’s technical vision must stay proportional to its business value vision, and the importance of each action must be shared and accepted by everyone on the project; only then will the project grow, and its Business Value grow with it.

Rudi Rocha

Software Engineering Manager and Software Engineer | Server Side Trainer | Human stuff as a hobby.