Data integration is a necessary part of many workflows, from customer data onboarding to payroll, but for many datasets the process is time-consuming and manual. Data is siloed in databases and SaaS applications, each holding information in a different format, making it difficult to move from one system to another. Lume is using AI to change this.
Lume's system uses AI to automate data mapping: it extracts data from silos and “normalizes” it, converting it into a standardized format that can be easily moved and integrated into other workflows. If an integration breaks partway through, a common problem, Lume notifies you and uses AI to try to remediate the issue. Lume also provides an API and a web platform so clients can incorporate it directly into their workflows.
What sets Lume apart from other data mapping tools is its focus on complex nested data formats like JSON, rather than extracting data from spreadsheets or PDFs. Lume co-founder and CEO Nicolas Machado told TechCrunch that Lume helps companies map complex arithmetic, classification, and text manipulation tasks, a focus he said saves companies time and money compared with outsourcing these data projects.
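To make the kind of mapping described above concrete, here is a minimal sketch of normalizing a nested JSON record into a flat target schema, including an arithmetic step and a text manipulation step. The field names, schema, and `normalize` function are hypothetical illustrations, not Lume's actual API:

```python
import json

# A nested source record, as it might arrive from one vendor's system.
source_record = json.loads("""
{
  "customer": {"full_name": "Ada Lovelace"},
  "invoice": {"total_cents": 12999, "currency": "usd"}
}
""")

def normalize(record: dict) -> dict:
    """Map a nested record into a flat, standardized schema."""
    first, _, last = record["customer"]["full_name"].partition(" ")
    return {
        "first_name": first,                                 # text manipulation: split name
        "last_name": last,
        "amount": record["invoice"]["total_cents"] / 100,    # arithmetic: cents to dollars
        "currency": record["invoice"]["currency"].upper(),   # text manipulation: casing
    }

print(normalize(source_record))
```

Automating this step means inferring the `normalize` transformation itself from the source and target schemas, which is the manual work Lume aims to eliminate.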
“One of the core problems we saw is that moving data seamlessly, truly seamlessly moving it between systems, is a completely manual process, and has been for literally 60 years,” Machado said. “Why can't we automate this? Why hasn't it been possible before? Because data is unique to each system. Each company, each vendor, each integration defines data in its own way. They structure the data in their own way. They understand the data in a different way.”
Machado and co-founders Robert Ross and Nebyou Zewde are no strangers to this problem. The three met as freshmen at Stanford University, studying computer science with a focus on AI. From there, they went on to work at technology companies ranging from Apple to OpenDoor, all working on data integration projects. In 2022, the founders saw the writing on the wall regarding advances in AI and decided it was the perfect time to solve this data integration problem.
“Every engineer has faced this problem at some point,” Machado said. “Every engineer has to do that. So we got together, literally went to my co-founder Robert's apartment, and worked through the night.”
Lume was founded in January 2023, launched its first product in March 2023, and went through Y Combinator's W23 batch. Machado said the company has seen strong inbound demand since then and has gained dozens of customers, ranging from startups to Fortune 500 companies, though he declined to provide further details.
Lume recently raised $4.2 million in a seed round led by General Catalyst with participation from Khosla Ventures, Floodgate, and Y Combinator, in addition to angel investors.
“They really understand this issue, and that's why they got into it,” Machado said. “That's why they're so excited about this. They're operators. They're like, 'Wait, I was a CEO 30 years ago and this was a problem. Is this still a problem? That's insane.'”
Machado said the round will be used for hiring (the company hopes to double its workforce from five to 10 by early next year) and for continued development of its technology.
Lume isn't the only company trying to solve data integration problems: SnapLogic has raised $371 million in venture funding, and Osmos is another startup helping businesses tackle the same issue. As more engineers work on the problem, competition is likely to intensify. Machado isn't worried, however, because he believes Lume's algorithms, and the way its API fits into a company's existing workflows, will differentiate it.
In the future, Machado hopes Lume will be the glue that sits between two data systems and seamlessly facilitates the flow of data between them.
“We all love data and believe strongly in its importance,” Machado said. “The metaphor we use is oil: historically, to extract value from it, you have to process it and use it to power machines and everything else. That's what data is, and what it has been.”