See how some of our clients solved their business challenges with AI and machine learning.
We created an algorithm that tests Facebook Ads retargeting campaigns based on different variables and analyzes past data to boost conversions and sales from Facebook while decreasing campaign costs.
Problem: Marketers needed a tool for creating effective Facebook Ad campaigns and optimizing their performance based on data, not just a hunch.
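The testing loop described above can be illustrated with a simple epsilon-greedy sketch: mostly spend impressions on the best-converting variant so far while still exploring the others. The variant names, conversion rates, and the bandit formulation itself are illustrative assumptions, not the actual algorithm:

```python
import random

def epsilon_greedy_allocation(variants, trials=1000, eps=0.1, seed=1):
    """Epsilon-greedy testing of ad variants: exploit the variant
    with the best observed conversion rate, explore with prob. eps.
    The 'true' rates in `variants` are simulated numbers."""
    rng = random.Random(seed)
    shows = {v: 0 for v in variants}
    wins = {v: 0 for v in variants}
    for _ in range(trials):
        if rng.random() < eps:
            v = rng.choice(list(variants))          # explore
        else:                                       # exploit best so far
            v = max(variants, key=lambda x: wins[x] / shows[x] if shows[x] else 0.0)
        shows[v] += 1
        wins[v] += rng.random() < variants[v]       # simulated conversion
    return shows, wins

# Hypothetical variants with different true conversion rates.
variants = {"broad_audience": 0.02, "lookalike": 0.05, "retarget_30d": 0.08}
shows, wins = epsilon_greedy_allocation(variants)
```

Over enough trials, the budget drifts toward the variant that actually converts, which is the core idea behind data-driven (rather than hunch-driven) campaign optimization.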
We used machine learning to assign attribution across marketing channels and allow marketers to connect data from various sources. This helped them better understand their audience and the customer journey from click to conversion. All in a comprehensive marketing data dashboard that collects real-time data and gives valuable insights into what’s driving campaign results and delivering ROI.
Problem: Data was scattered across marketing channels. Teams needed a tool to compare insights in one place: to evaluate conversions while still accounting for the cost of each channel.
The algorithm shows which actions impact sales the most, where clients can cut costs, and how a budget increase can improve performance, while highlighting new opportunities in Google Analytics.
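Multi-touch attribution of the kind described above can be sketched minimally; the linear (equal-credit) rule and the channel names here are illustrative assumptions, not the client's actual model:

```python
from collections import defaultdict

def linear_attribution(journeys):
    """Split each conversion's credit equally across every channel
    touched on the way to it (linear multi-touch attribution)."""
    credit = defaultdict(float)
    for touches in journeys:
        share = 1.0 / len(touches)
        for channel in touches:
            credit[channel] += share
    return dict(credit)

# Each journey is the ordered list of channels before one conversion.
journeys = [
    ["facebook", "google", "email"],
    ["google", "email"],
    ["facebook"],
]
print(linear_attribution(journeys))
```

Swapping the credit rule (first-touch, last-touch, time-decay) changes only the `share` computation, which is why a dashboard can let marketers compare attribution models side by side.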
We created a continuously trained machine learning algorithm that can recognize brand logos in images, along with a tool generating synthetic data necessary to train it. The system recognizes brand logos in any online image with 60% accuracy, making it the most effective algorithm of its kind.
Problem: Marketers needed a tool that could automate real-time online monitoring of brand logos in images posted on social media. Automated brand logo recognition helps brands detect additional branding opportunities, react to potential issues and crises much more accurately, and protect brand identity.
Step one: Generating synthetic data
Step two: Training the algorithm
Step three: Building a verification tool
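Step one can be illustrated with a minimal sketch: pasting a "logo" patch into a "background" at a random position and recording the bounding box as the label. Real training data would use photographic images and augmentations; the plain arrays here are toy stand-ins:

```python
import random

def synthesize_sample(logo, background):
    """Paste a small 2-D 'logo' array into a copy of a larger
    'background' array at a random position; return the composite
    and its bounding box (row, col, height, width) as the label."""
    bh, bw = len(background), len(background[0])
    lh, lw = len(logo), len(logo[0])
    row = random.randint(0, bh - lh)
    col = random.randint(0, bw - lw)
    sample = [r[:] for r in background]        # copy, keep original intact
    for i in range(lh):
        for j in range(lw):
            sample[row + i][col + j] = logo[i][j]
    return sample, (row, col, lh, lw)

background = [[0] * 32 for _ in range(32)]    # blank 32x32 "image"
logo = [[255] * 4 for _ in range(3)]          # 3x4 bright "logo" patch
sample, bbox = synthesize_sample(logo, background)
```

Because the paste position is known, every synthetic sample comes with a perfect ground-truth box, which is exactly what makes synthetic data cheap to produce at scale for detector training.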
We used advanced machine learning algorithms to create a tool that automated invoice classification, reduced manual work, and improved the accuracy of cost tracking by accounting departments.
Problem: Our client had to classify over 20,000 invoices manually, primarily to help with accounting and support decision-making. Processing the invoices involved a lot of manual work that a smart, automated solution could eliminate.
Here's how we did it.
Step one: Removing manual work
Step two: Simplifying the process
Step three: Leveraging the new system
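As an illustration of the kind of text classification involved, here is a minimal Naive Bayes sketch over invoice-line words; the categories and training texts are invented examples, not the client's data:

```python
import math
from collections import Counter, defaultdict

class InvoiceClassifier:
    """Minimal multinomial Naive Bayes with add-one smoothing."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)
        self.label_counts = Counter(labels)
        self.vocab = set()
        for text, label in zip(texts, labels):
            words = text.lower().split()
            self.word_counts[label].update(words)
            self.vocab.update(words)
        return self

    def predict(self, text):
        words = text.lower().split()
        best, best_score = None, float("-inf")
        total = sum(self.label_counts.values())
        for label in self.label_counts:
            # log prior + smoothed log likelihood of each word
            score = math.log(self.label_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in words:
                score += math.log((self.word_counts[label][w] + 1) / denom)
            if score > best_score:
                best, best_score = label, score
        return best

clf = InvoiceClassifier().fit(
    ["office chairs and desks", "aws cloud hosting fee",
     "printer paper and toner", "domain renewal and hosting"],
    ["furniture", "it", "supplies", "it"],
)
print(clf.predict("monthly hosting invoice"))  # → it
```

A production system would add OCR for scanned invoices and a stronger model, but the pipeline shape (extract text, score against learned categories, route to accounting) is the same.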
We created a tool that automated the extraction of metadata, such as director, actors, year, or genre, from TV show descriptions. We used a hybrid extraction method based on regular expressions, linguistic rules, and statistical language models, with dependency analysis to decompose description templates.
Problem: Our client needed a way to detect metadata in program descriptions and move it to a new database. The process was laborious and needed to be automated to save time and manual effort.
Step one: Cleaning the metadata
Step two: Creating a pipeline for metadata extraction
Step three: Template analysis
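The regular-expression part of the hybrid method might look like this minimal sketch; the description template and field order are hypothetical:

```python
import re

# Hypothetical description template; real formats varied per broadcaster.
PATTERN = re.compile(
    r"dir\.\s*(?P<director>[^,]+),\s*"
    r"(?P<year>\d{4}),\s*"
    r"(?P<genre>[^,]+),\s*"
    r"starring\s+(?P<actors>.+)"
)

def extract_metadata(description):
    """Pull director, year, genre, and actors out of a TV show
    description using a regular-expression template."""
    match = PATTERN.search(description)
    if not match:
        return None
    meta = match.groupdict()
    meta["actors"] = [a.strip() for a in meta["actors"].split(",")]
    return meta

desc = "A heist goes wrong. dir. Jane Doe, 2019, thriller, starring Ann Smith, Bob Lee"
print(extract_metadata(desc))
```

Regexes handle the rigid templates; the linguistic rules and statistical models mentioned above take over for free-form descriptions where no fixed pattern applies.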
We used the latest machine learning technologies and neural networks to build an intuitive, comprehensive application for consumers and businesses that helps automate tax returns and save time in large organizations. The automation guarantees a streamlined tax return process in any company, extracting data from a tax card in just 13 seconds, with 93% efficiency.
Problem: There was no way to settle tax returns easily and quickly based on the received tax cards. The process was manual, taking up hours instead of the seconds it could have been taking with smart AI technology.
To simplify and shorten the application process, the system only selects relevant profiles. At first, the user answers simple questions. Based on the information, the application generates a form that’s tailored to the customer.
Scanning the tax card with a mobile device saves several minutes of processing time. In just 13 seconds, the system extracts data with around 93% efficiency (across all fields) and inserts it into the appropriate fields of the form. This speeds up the tax return process, while the intuitive user interface makes it easy to follow.
The app also includes a tax calculator that shows the user's refund amount or any additional payment due. The value appears shortly after the first data is entered and updates as each subsequent step is completed.
The app is available in two language versions: English and Polish. Switching between the two languages is seamless and doesn’t require reloading the page.
Additional features for business clients save time in large organizations. Based on an uploaded file of tax cards, the system creates a list of new tax returns and guides the user through each application, one after another.
Business clients also receive a tailored offer defined by application packages, and they can pay for their tax applications in a choice of currencies through two different payment platforms.
For one of our clients, we created a system that monitors online advertising, tracks page views and clicks, and compares them to the data provided by external marketing agencies.
Problem: The client wanted to be able to easily verify the number of online ad views and clicks from ad servers with their own results. Ad servers use their own statistics, which often results in inconsistent and inaccurate data, affecting the decision-making process in marketing departments.
We used data mining to equip the system with useful features. Our engine can receive large amounts of data, then aggregate and process it. It also generates reports and displays data in real time. The system also holds Google and Onet certifications for traffic-measurement requirements.
After defining the tests, we conducted trials of each technology to find the one that met all the requirements and saved our team a considerable amount of development work.
We created an algorithm that targets users on Facebook based on their psychological features and suggests how to create a Facebook ad that gets clicked. The system allows tracking ads/users/pixels and generating creative briefs for graphic designers based on keywords and the features of the target group.
Problem: Our client was experiencing low CTRs for their visual content on Facebook. One of the reasons was that the content wasn’t precisely targeted to the specific audience groups, which caused wasted ad money and ineffective campaigns.
Using a short description of the campaign, primary demographic data (gender, age, city size, etc.) and keywords describing the target group, the system divides the targeted group into sub-groups with a specific psychological profile and aesthetic preferences.
The generated creative brief helps to create a visual ad that matches the group and guarantees increased conversion.
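Splitting a target group into sub-groups can be sketched with plain k-means clustering; the two-feature audience vectors below (age, engagement score) are illustrative, not the system's actual psychological features:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: partition feature vectors into k sub-groups
    by repeatedly assigning points to the nearest center and
    recomputing each center as its cluster's mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Two obvious segments: young/high-engagement vs. older/low-engagement.
audience = [(22, 0.9), (25, 0.8), (24, 0.85), (51, 0.2), (48, 0.3), (55, 0.25)]
centers, clusters = kmeans(audience, k=2)
```

Each resulting cluster can then be paired with its own creative brief, which is the mechanism behind "one ad style per sub-group" targeting.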
The algorithm was created by psychologists, developers, and the data science team and it took six months to develop. Because of cultural differences, the system is currently focused exclusively on the Polish market.
For a European college in Cracow, we created an algorithm analyzing the profitability of degree courses. The system is an integral part of the university’s business intelligence, allowing the school to verify the quality of education based on the graduates’ work achievements and enables the creation of business plans based on reliable forecasts. All while reducing the cost of manual analysis of large data volumes.
Problem: To determine whether a course was profitable, the school's employees had to analyze large data sets, which was time-consuming, inefficient, and prone to human error. The data was dispersed across numerous SQL databases, as well as raw sources like text files, PDF files, e-documents, and spreadsheets.
The goal of the project was to create a tool for integrating large and dispersed data sets. The time needed for the digital analysis was cut down to just a fraction of a second, instead of many weeks of the administration staff’s work.
One of the most important goals of the project was to make the system intuitive and easy to use for all university employees. As a result, it gives more accurate analysis results (in real time), reducing the investment in time and money.
We created a mobile app that uses personal data to simplify the self-management of diabetes: collecting information from smart devices and medical equipment to present a holistic view of how to optimize your routine each day.
Problem: An estimated 415 million people in the world have diabetes, a number predicted to rise to approximately 642 million by 2040. Diabetes kills one person every six seconds (five million a year) – more than HIV, tuberculosis, and malaria combined.
One of the 415 million patients is Przemysław Majewski, CEO of DLabs, who wanted to help other people with diabetes manage their illness, plan their diet, and create training plans.
The technology created by the DLabs team uses artificial intelligence to personalize diabetic therapy and help patients in their day-to-day management of this chronic disease. This solution will especially benefit people who have type 1 diabetes, which requires intensive treatment and exceptional care, in particular during sports activities.
In the next step, we intend to adapt the solution to the needs of patients with type 2 diabetes and to consider other groups of customers.
The long-term goal of the project is to refocus the market on investments that prevent complications resulting from diabetes, avoiding costly treatment and the risk of severe consequences.
Suguard is intended to help patients achieve stable blood glucose levels. In particular, it can help those patients who lead an active lifestyle, to improve their health and well-being, and to reduce the cost of treatment.
A handful of clients use Suguard daily. It offers 90% efficiency in predicting actions to stabilize blood glucose levels, decreasing the risk of complications while helping users choose the right foods, activities, and medicines.
The DLabs Chief Data Science Officer helped create modern decision-support tools and recommendations for commanders of the State Fire Service during rescue and firefighting operations. At the heart of the system are activity-detection algorithms and dead-reckoning navigation based on an IMU (inertial measurement unit).
Problem: Making strategic decisions in crises under time pressure is extremely difficult. The fire brigade needed a system that improves the effectiveness of rescue operations by supporting decision-makers with real-time data analysis.
The goal of this project was to improve the safety of firefighters and to optimize the results of rescue operations. The core of the project was to design a modern risk management system that would help the commander in action.
The amount of information, time pressure, and unimaginable responsibility can effectively hinder decision-making in a given situation. Data enables us to establish a productive interaction between man and machine, making the work of firefighters much easier in high-risk actions.
Finding solutions requires excellent knowledge of the complex algorithms of dynamic risk management. The system both assesses and forecasts risk factors in real time and assists the commander by recommending the best scenarios for a given situation. The scenarios are based on models created from computer simulations and historical data.
One of the most critical issues was finding the right algorithm for each sub-problem. For our data scientist, the most interesting task was creating a version of the Kalman filter that fuses data from gyroscopes and accelerometers, improving the estimated trajectory of each firefighter.
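A one-dimensional version of such a sensor-fusion filter can be sketched as follows; the noise parameters are illustrative guesses, not the project's tuned values:

```python
def kalman_angle(gyro_rates, accel_angles, dt=0.01, q=0.001, r=0.03):
    """1-D Kalman filter: integrate the gyro rate as the prediction,
    then correct with the noisy-but-drift-free accelerometer angle.
    q = process noise, r = measurement noise (illustrative values)."""
    angle, p = accel_angles[0], 1.0
    estimates = []
    for rate, measured in zip(gyro_rates, accel_angles):
        # Predict: advance the angle using the gyro rate.
        angle += rate * dt
        p += q
        # Update: blend in the accelerometer measurement.
        k = p / (p + r)
        angle += k * (measured - angle)
        p *= 1 - k
        estimates.append(angle)
    return estimates

# Stationary sensor: the gyro has a constant bias (it drifts),
# while the accelerometer correctly reads an angle of ~0.
gyro = [0.5] * 200       # rad/s bias
accel = [0.0] * 200      # true angle is zero
est = kalman_angle(gyro, accel)
```

Integrating the biased gyro alone would drift to 1.0 rad over these 200 steps; the accelerometer correction keeps the estimate near zero, which is the effect that improves a firefighter's computed trajectory indoors where GPS is unavailable.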
We created a system that predicted changes in stock market trends for an international hedge fund.
Problem: The client wanted to be able to predict stock market trends and create the best trading strategies on currency markets.
Step one: Quantitative analysis
We collected, aggregated, and analyzed historical market data for transactions. An essential element was generating statistics and reports from big data sets.
Step two: Creating algorithms
During the project, it was necessary to build and analyze various statistical models and analyze historical market data using data mining techniques.
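As a toy example of the kind of statistical model analyzed, here is a moving-average crossover signal over a price series; the window sizes and prices are illustrative, and real strategies are far more involved:

```python
def sma(prices, window):
    """Simple moving average over a price series."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(prices))
    ]

def crossover_signals(prices, fast=3, slow=5):
    """Emit +1 (go long) when the fast average crosses above the
    slow one, -1 (go short) when it crosses below, else 0."""
    f, s = sma(prices, fast), sma(prices, slow)
    offset = slow - fast          # align the two series on end index
    signals = [0]
    for i in range(1, len(s)):
        prev = f[offset + i - 1] - s[i - 1]
        curr = f[offset + i] - s[i]
        signals.append(1 if prev <= 0 < curr else -1 if prev >= 0 > curr else 0)
    return signals

prices = [10, 10, 10, 10, 10, 11, 12, 13, 12, 11, 10, 9]
print(crossover_signals(prices))  # → [0, 1, 0, 0, 0, 0, -1, 0]
```

Backtesting such rules against aggregated historical data, then comparing their statistics across big data sets, is the essence of the quantitative-analysis step above.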