EP 164 - How to collaborate over real-time operational data - Krzysztof Malicki, Founder & CEO, TrustedTwin
|Feb 17, 2023|
This week, our guest is Krzysztof Malicki, founder and CEO of TrustedTwin. TrustedTwin is an API-first solution for sharing real-time operational data with multiple partners, powered by digital twins, to build processes and enable services.
In this episode, we discussed the technical and business challenges of collaborating on operational data, including ownership, access, and ledger tracking. We also talked about the importance of data sharing to enable new solutions and business models, like device services and data federation for platform plays.
● What are the essential elements of sharing operational data with your partners?
● What are the functionalities of digital twin technology?
● What are the two main types of customers that are using the platform?
● What are the technical and legal challenges in operational data?
Erik: Krzysztof, thank you for joining us on the podcast today.
Krzysztof: A pleasure for me too. Thanks a lot for that possibility.
Erik: Yeah, I know. I'm looking forward to this in part because, as we were discussing upfront, you're hailing from my home country — or my grandfather's home country, I should say — of Poland. So, tell me, what does the tech ecosystem look like in Poland today?
Krzysztof: I think that Poland is very focused on IT, really. There are a lot of large companies with offices, even R&D centers, here, like Intel and Amazon. Actually, I am from Gdańsk. That was exactly the city where your grandfather was living. Even in Gdańsk, we have Intel. We have Amazon. We have a lot of international IT companies. From a technological point of view, a lot is going on in Poland.
Erik: Yeah, I've heard that. I've heard there's a lot of talent there, and a lot of companies are obviously setting up their subsidiaries to do work. But then, obviously, entrepreneurs are starting to be spun out of those. Of course, you're one of these entrepreneurs, but you have quite a career in the IT sector. It sounds like you were primarily working with more mature businesses previously. What was it that triggered you? Well, I guess you've had your own business for quite some time now, this Aiton Caldwell. But I think that's not what I would call a tech startup, as I understand it. What is it that triggered you now to set up this business, Trusted Twin?
Krzysztof: Actually, Aiton Caldwell was a tech startup. It was, in fact, the first voice over IP company in Poland. It is still the largest one. We were not focused only on delivering the service. In fact, we built the entire technology stack ourselves. So, it was quite a large cloud-based system, partially for customers, partially for businesses. It was a technological startup with a lot of new technological solutions. That was 15 years ago, when we started.
In 2011, we took Aiton Caldwell to the stock exchange. After that, I decided to work a little bit for the university in Gdańsk. After almost 10 years of working for the university, I decided to try once again. This time, my new subject was data; in fact, data collaboration. So, that's what the new startup is about.
Erik: Okay. Well, congratulations on going public. I did notice that. I guess that gives you a little bit of flexibility also in terms of where you devote yourself going forward. Trusted Twin: what is it about this topic? Obviously, there are 100 different topics or areas that you could be devoting your energy to today. So, what is it about this particular topic that incentivized you or drove you to focus on it?
Krzysztof: Okay. Of course, it is somehow related to Digital Twins. But the Digital Twin, as such, is a very broad topic, so I will return to that later. I would like to start by saying that Trusted Twin is about data. Because we noticed that there are two main dimensions along which you can divide different data solutions. One is analytical data versus operational data, where the focus until now was mostly on analytical data — usually historical data organized around tables or datasets, and usually used for making better business decisions or for optimization.
But on the other hand, we have operational data, which is, in fact, more related to IoT, where we're talking about data used for the execution of processes. So, it has to be real-time. It has to be organized as objects, as processes are usually built on top of objects. That is to say, it is for execution, not for optimization. That's the first dimension.
The second dimension is not sharing data versus collaborating on data. We see that more and more companies realize that the value in the data they already have lies in collaborating on that data with others. So, we sit at the intersection of operational data and data sharing, or data collaboration, as collaboration is usually built on top of objects. That's why we use the concept of an operational Digital Twin, which is not for simulation but is an object on top of which you can build your business process, for example.
Erik: Okay. Let me check that I understand this properly. I think when a lot of people think of a Digital Twin, they're thinking of maybe a Siemens product used for simulation of a factory production line, something like this. But you're looking at the digitalization or virtualization of data objects, and then enabling the sharing of those objects between companies. Is that the right way to think about this?
Krzysztof: Maybe not virtualization. Okay. Let's share one of the typical ICPs that we are dealing with. We are working, for example, with traditional companies that offer a traditional product, let's say a meter, like a utility meter, which is a physical device. But they want to convert from selling products into selling a digital service. In that case, it could be something like measurement as a service. Their customers don't want to buy meters anymore; they simply want to know how much water they used yesterday. The measurement becomes the service.
When going from a traditional product to a digital service, there is always a digital component that you have to add, which is usually related to data. That's exactly where the operational Digital Twin can be used. You're converting your physical meter into its operational Digital Twin — a virtual representation of your physical meter that you can use to build that new digital service. Measurement as a service, in that particular case.
Erik: Okay. Clear. Then you're capturing this data and, I suppose, the metadata around it so that it can be used effectively. And specifically, you are capturing this with the objective of helping companies share data between them.
Krzysztof: Collaborate on data. Let's stay with that example. If you're a traditional company, you do not have expertise in data — starting from building the infrastructure, but also, for example, integrating with your customers' systems. Your core business in that case is manufacturing meters. But there are things that are non-core but still mission-critical if you would like to implement that new service. These mission-critical, non-core requirements are related to scalability, availability, data retention, integrations, security, and so on. This is what we provide, so our clients can stay focused on what's core to them, while all those non-functional but still mission-critical problems related to data are covered by our platform. That's the first one.
The second one is that they want to deal only with the measurement. But to build the ecosystem, you also need different types of data. For example, if you want to offer measurement as a service, you sometimes have to send an installer from a service company to install a meter, to replace the battery, and so on. So, you need access to the address, which is provided by the building administrator in that case. But that meter manufacturing company doesn't want to become responsible for GDPR data. In fact, they do not need that data to simply provide the measurement. That's why we are talking about data collaboration and ownership management, which is called data governance.
Every partner in such an ecosystem provides their data themselves, but they still own that data, so they can simply set a rule for when that data becomes accessible to the others; in this case, for the purpose of installation. But that meter manufacturing company doesn't want to store that data in their systems. That's why there is a need for the operational Digital Twin: an object that exists in between the collaborating partners, so everyone can contribute data to it, and everyone can use that data at the same time, without losing ownership and with the ability to manage the visibility of and access to the data.
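The ownership-and-visibility model Krzysztof describes can be sketched in a few lines of code. This is an illustrative toy, not the actual Trusted Twin API: the class, method names, and parties ("meter_maker", "administrator", "installer") are all invented for the example. The point is that each partner owns the entries it contributes to the shared twin object and can grant or revoke other partners' read access at any time.

```python
# Toy model of an operational Digital Twin shared by several partners.
# Each entry is owned by the partner who contributed it; only the owner
# can change who else may read it.

class OperationalTwin:
    def __init__(self, twin_id):
        self.twin_id = twin_id
        self.entries = {}  # key -> {"owner": str, "value": object, "readers": set}

    def put(self, owner, key, value, readers=()):
        """A partner contributes data and decides who may read it."""
        self.entries[key] = {"owner": owner, "value": value, "readers": set(readers)}

    def grant(self, owner, key, partner):
        entry = self.entries[key]
        assert entry["owner"] == owner, "only the owner can change access"
        entry["readers"].add(partner)

    def revoke(self, owner, key, partner):
        entry = self.entries[key]
        assert entry["owner"] == owner, "only the owner can change access"
        entry["readers"].discard(partner)

    def get(self, requester, key):
        entry = self.entries[key]
        if requester == entry["owner"] or requester in entry["readers"]:
            return entry["value"]
        raise PermissionError(f"{requester} may not read {key}")

# The meter maker contributes readings; the building administrator
# contributes the address and grants it to the installer only for the
# duration of the installation visit.
twin = OperationalTwin("meter-0001")
twin.put("meter_maker", "reading_m3", 42.7, readers={"administrator"})
twin.put("administrator", "address", "Długa 1, Gdańsk")  # private by default
twin.grant("administrator", "address", "installer")
print(twin.get("installer", "address"))  # → Długa 1, Gdańsk
twin.revoke("administrator", "address", "installer")     # visit over: access gone
```

After the revoke, the installer keeps nothing but what it already saw; the live object no longer answers for that key, which is exactly the real-time governance being described.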
Erik: Okay. Interesting. So, you're enabling collaboration around data where ownership and access can be controlled and withdrawn in real time. I noticed the word blockchain a couple of times but, fortunately, not too many times. What I mean is, it looks like you're making use of blockchain as a technology, but you're not trying to promote yourself as "SaaS, because we're blockchain," which I think is a favorable trend. We're starting to see more practical applications. Where does that fit in? To what extent is it an essential element of the tech stack, and to what extent is it optional?
Krzysztof: In fact, it is part of the service that we can provide. When you collaborate with partners, and if this is an ecosystem of equal partners, then you will probably, sooner or later, need a source of verifiability, a source of trust. Blockchain is a very good technology for that. We would like to provide such a source of verifiability in a distributed and, let's say, democratized way. That's why we are using blockchain, but only to notarize events. If you would like something to be remembered forever just to have that ability — the ability to return to it in the future to verify what really happened — then we can use blockchain to notarize data or events. That's why we call it a source of verifiability.
Erik: Okay. Clear. So, if you wanted to prove that I shared this data with you on this date, and that you therefore had access to that data for whatever purpose, blockchain would allow you to prove that that event occurred.
Krzysztof: You can hash a document, hash an event, and store that in a blockchain, just to have the ability to return to that particular, let's say, block in the future to check what really happened.
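The notarization pattern described here is simple to sketch: store only a digest of the event on the chain, keep the data itself off-chain, and verify later by recomputing. This is a generic illustration of the hash-anchoring idea, not Trusted Twin's implementation; the event fields and SHA-256 choice are assumptions.

```python
# Sketch of event notarization: anchor a hash of an event, then later
# prove a claimed event matches (or does not match) the anchored digest.
import hashlib
import json

def notarize(event: dict) -> str:
    """Return a digest suitable for writing to a blockchain."""
    # sort_keys gives a canonical serialization, so the same event
    # always produces the same digest.
    canonical = json.dumps(event, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def verify(event: dict, anchored_digest: str) -> bool:
    return notarize(event) == anchored_digest

event = {"action": "grant_access", "twin": "meter-0001",
         "to": "partner-b", "at": "2023-02-17T12:00:00Z"}
digest = notarize(event)          # this digest is what goes on-chain

assert verify(event, digest)      # later: prove the event really happened
assert not verify({**event, "to": "partner-c"}, digest)  # tampering detected
```

Only the 64-character digest needs the chain's immutability; the event itself stays private between the partners.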
Erik: Maybe you can quickly walk us through the functionality. Again, the concept of a Digital Twin is, as you said, a very broad one, so it'd be useful to understand where specifically you fit in here. I think on your feature set page, you have a good breakdown of this in terms of batch operations, doc operations, et cetera, if that's the best way you think to walk us through where you fit into the service stack.
Krzysztof: Okay. First, I will start with this. Today we have, let's say, two ends of the same problem. First, you have to gather data. If you're talking about IoT, that's a separate problem. Then, once you have that data in your system, you have to monetize it somehow: you can build a process on top of it, you can sell that data by giving access, you can federate that data with others to, for example, gain more data for big data analysis, and so on. We are working on that second part. Our service is best for situations where you need to collaborate on data. Of course, you can use it to gather data, but when talking about IoT, it will probably not be the best fit. If we're talking about something else, it will also be very good for gathering data.
When you have that data and you want to collaborate on it, we provide, let's say, two layers. The first layer is the data itself: the ability to define data models and to store data. When you have data on the platform — a virtual representation of a meter, a virtual representation of even a shoe (this is one of our examples), or a virtual representation of a customer, like a customer profile — then the second layer is what you can do with that data. We provide the ability for real-time data governance, so you can always decide which part of your data is visible to the others.
As we are talking about objects built from data coming from multiple partners, each party can set those rules for what is visible to the other parties using the same virtual object. That's the first one. The second one is the ability to put a large number of objects onto the platform without thinking about those non-functional requirements like scalability, availability, response time, and so on. We provide scalable global infrastructure, so you can stay focused on what you want to do with your data, not on how you should do it and how to prepare the infrastructure for it.
Then we provide a lot of different integration technologies that separate integration from data. For example, this reduces the full graph of integrations to a star graph of integrations and also separates partners, so no one can harm another partner in the ecosystem. Everyone has independence in choosing the best technology and best mode for integration. We can offer REST APIs. We can offer, for example, MQTT as a protocol to use the data. We offer a subscribe-and-notify mechanism, with webhook invocation, for example. We offer different ways to represent data stored as objects, such as in a tabular way; this is also possible. We offer the ability to store historical data and any type of file, like a voice file, a PDF, or a CSV file. You can manipulate the data and use it in the way that is best for you. We offer a lot of functionality and a lot of support for building processes on top of that data, even a workflow mechanism.
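The subscribe-and-notify mechanism mentioned above can be sketched as an in-process publish/subscribe loop. In the real platform this would be a webhook call or an MQTT message over the network; here the class, names, and payload are all invented for illustration, and a plain callback stands in for the HTTP delivery.

```python
# Minimal in-memory sketch of subscribe-and-notify: partners register
# interest in a twin, and every change to that twin fans out to them.

class TwinNotifier:
    def __init__(self):
        self.subscribers = {}  # twin_id -> list of callbacks

    def subscribe(self, twin_id, callback):
        """A partner asks to be told whenever this twin changes."""
        self.subscribers.setdefault(twin_id, []).append(callback)

    def publish(self, twin_id, change):
        """A change to the twin triggers every subscriber's callback
        (in production: a webhook invocation or MQTT publish)."""
        for callback in self.subscribers.get(twin_id, []):
            callback(twin_id, change)

received = []
notifier = TwinNotifier()
notifier.subscribe("meter-0001", lambda tid, ch: received.append((tid, ch)))
notifier.publish("meter-0001", {"reading_m3": 43.1})
print(received)  # → [('meter-0001', {'reading_m3': 43.1})]
```

The star-graph point follows from the same structure: each partner integrates once, with the notifier, instead of once per counterparty.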
Erik: I think this would be a good time to discuss the users of your platform a bit. You've already mentioned a couple of different industries, so that would be one area. What are your typical customer profiles from an industry perspective? And then, who are the users who would be accessing the platform? Maybe first, from an industry perspective. On the one hand, this is a fairly horizontal solution, but you're always dealing with data, so there's particular sensitivity. And, of course, as a business, you have your own strategic focus in terms of segments. Who are you focusing on as customers today?
Krzysztof: As you noticed, the platform is horizontal, so you can have any object model on it. But today, we see two main types of customers using our platform. First, there are customers that are traditional businesses that want to, let's say, start their digital transformation journey. They want to convert what is usually a typical product into a digital service. As I said, we are working with a utility meter manufacturing company that wants to convert from selling devices, meters, into selling measurement as a service.
We are working with a professional shoe manufacturing company, international and publicly listed, that wanted to convert a shoe into an IoT device to unlock new business models. Again, they need that digital component, but this is not their core. They don't know how to handle it, and they don't have a team for that, so they are creating a digital representation of each and every pair of shoes.
We have a project where the company is manufacturing devices for leak detection. Again, from their perspective, their core is detecting leaks, having the AI or machine learning algorithm that detects leaks. But they have to handle a lot of data coming from those devices, which is, again, mission-critical but non-core. We have another company that is gathering information from electricity meters and wants to create a virtual power plant for customers, aggregating the ability to produce energy so the price is better. Again, for them, the core business is the aggregation, not the data handling. That's the first type of customer.
The second type of customers are digital-first companies. Those companies usually think about creating profiles of something, usually their end customers. They need those profiles for two main reasons. One is a master data record for multiple legacy systems, and the second is usually data federation. For example, we have a system with one payment processor, probably the largest in the region, and several thousand merchants. No single merchant is big enough to benefit from big data analysis compared to, say, Amazon. But if they federate their data without losing the ability to control its usage, they have a dataset big enough to make recommendations at the level of Amazon, for example. That's the second case. It is probably less related to IoT, but this is also what we see as one of the best uses for our platform.
Erik: That's fascinating. Okay. There would be different usage behaviors here. In one case, with the IoT device, I guess the platform, let's say the solution provider, would own that data. Although then you might get into discussions, if it's a B2B case, with the operator owning the data.
Krzysztof: We do not own the data. We would never consider that.
Erik: Not your platform; I mean, your customer. But then, you would have a case where somebody is providing a connected sensor, and they would own that data. Or their customer, the operator, would own the data but have a contract granting access, and you can help them manage those contracts. Is the user always going to be either, let's say, the IoT device manufacturer who is establishing this new business case, or the platform provider who is setting up this new data aggregation business model? Could it be an operator? Let's say a manufacturer who says, "I have a bunch of sensors, and I want to allow the OEMs that sell them to me to access them for maintenance purposes, but only for maintenance purposes," something like this.
Krzysztof: We also have other projects which relate, for example, to predictive maintenance. If you're producing quite a large device and, for example, would like to implement a new business model where you're not selling this large device but leasing it, then you need information about usage. Predictive maintenance is another, let's say, use case that we are working on. There is a lot of—
Each time you work with data, especially if that data is generated somewhere outside of your factory or your office, you need to transfer that data and store it somewhere. For example, we're building a POC for a company that is scanning cars from the outside. They have a lot of robots somewhere in the field, and they need that data to come to their operational data center. But they don't want to deal with problems of scalability, transfer, and data retention. Then they share those, let's say, images with their customers. Again, sharing, collaborating on data, storing data: everything that is non-core but mission-critical is covered by us.
Erik: Help me understand some areas related to at least the perception of risk. Maybe it's real risk, maybe it's perception. I would imagine that there would be two concerns. One concern is: in theory, I own the master data and I'm providing access to company X to use it. But how can I prevent them from copying that data? If they copy it, I lose control of it, and I'm not able to restrict access in the future; I have lost control of that data set. That would be the more technical challenge.
Then there would be the legal challenge, which would basically be a lawyer putting his hand up and saying, "I don't understand this. Let's not do it, because I don't understand it." There could be all sorts of concerns there. But generally, a lot of companies are in a situation where, if senior management and the legal team can't understand something, it can be hard for them to move forward. How do you look at those two topics today?
Krzysztof: On the first one: there's no way to prevent someone from storing the data that they see. You can even film your own TV while watching a movie; no one can prevent you from that. But when we're talking about real-time processes based on data, if you lose the ability to use the data in real time, then you just have a copy of the data from last year. That's possible. Someone can always take a picture of your car, but it doesn't mean that she or he can use your car. It's just historical data, which becomes less important when we're talking about operational data — the data which is used daily to run the process.
If you kick somebody out of the ecosystem, then, of course, they can keep part of the historical data. But that's all. The only way to secure that is to sign an agreement, and then to work with your lawyers to check whether they deleted all the data. This is the typical situation of an NDA: I am providing you with the data, but you have to promise that when we stop our cooperation, you will delete or return it. Of course, you can always try to cheat. But then, there are lawyers for that. What was the second one?
Erik: Well, the second was just the more general headache: especially when you're talking about new business models and so forth, the legal team and maybe the senior management team might not have the technical competence to understand how this works and what the real risks are. But if they have concerns, they might be a bottleneck despite that.
Krzysztof: That's why we have to provide certificates. It's not as if someone will trust us just because we are called Trusted. We always have to demonstrate compliance with certificates. That's why we are now investing a lot in, for example, obtaining a SOC certificate and ISO 27001 — certificates related to security. Of course, we are always trying not to reinvent the wheel.
The base infrastructure for our service is AWS or Azure. We try to inherit as much as possible from their services and to build on the security they provide. You can think of our service this way: of course, you can build your own car from parts (that's Azure or AWS), or you can use a ready-to-use car which is best for the service you need to offer. That's us. We use only certified parts, and we certify the car itself.
Erik: Yeah, clear. If I look at your pricing model, it also reflects the infrastructure-as-a-service model, in that you have fairly transparent tiers. You're paying for ledgers, for API events, for storage, and for transfer.
Krzysztof: Those are the only four factors. Just to give you an estimate: in the project with the meter manufacturing company, they are planning to have around a million meters on our platform, each meter storing a measurement once a day. Calculating all the events, ledgers, storage, and so on, the cost was less than €0.01 per month per meter to handle the entire infrastructure, the integrations, the data, and so on. From their perspective, it was totally fine.
In fact, the total cost of the platform was less than the cost of a single senior engineer. That was how they compared us against building it themselves. They would need at least two to three people for half a year to even start thinking about a platform like that, provided they had the expertise and knew how to build it. On the other hand, they can have everything they need, covering those mission-critical issues related to the new service, for the price of a single software engineer per month.
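The figures quoted above can be checked with a quick back-of-envelope calculation. The per-meter price is the upper bound Krzysztof states; everything else here is just arithmetic on the numbers from the conversation, not Trusted Twin's actual price list.

```python
# Back-of-envelope check of the cost estimate in the conversation.
meters = 1_000_000                 # "around a million meters"
cost_per_meter_per_month = 0.01    # euros; "less than €0.01 per month per meter"

total_monthly_cost = meters * cost_per_meter_per_month
print(total_monthly_cost)          # → 10000.0 (euros per month, upper bound)
```

At most roughly €10k a month for the whole fleet, which is indeed in the range of a single senior engineer's fully loaded monthly cost, as claimed.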
Erik: Okay. Clear. The value proposition here, basically, is that you are allowing somebody to use infrastructure as a service: not bare-bones infrastructure, but infrastructure that's been tailored and designed for these particular purposes.
Krzysztof: If you need to model some kind of object on the platform, something that could represent large machinery like a car, a device, or something smaller like a meter, and you don't want to deal with the infrastructure but you know the model exactly and need to build a process on top of that object, sharing or collaborating on data with your partners in an ecosystem, then we provide everything you need.
You can start testing your model from day one without an initial investment. That's why we say: start with a free sample, because we offer something like that. You can build your POC, a proof of concept, in a month to check whether your service will work, to test your service with end customers, and so on. That's also what we offer: the ability to build your service fast without worrying about the initial infrastructure, the initial investment, or the expertise required. Focus on what's core for you, and focus on what's functional for you.
Erik: The users here, I guess, are going to be the IT team using the solution. Are there any particular skill sets required to make effective use of it, or would anybody comfortable with Azure be comfortable here?
Krzysztof: In fact, we work in two modes. This is an API-first platform: not a platform for the end customer, but something you can use as part of your technology stack and build on top of. We have two cases. One case is when a company already has a small IT team with the ability to use APIs like REST APIs or MQTT APIs. They don't have to have expertise in cloud and data; they simply need the expertise related to the way they usually work. That's the first situation.
The second situation is when there is an external integrator providing a solution based on our platform for the end customer. That's a situation where the end customer does not have an internal IT team. But to make it easier, we offer a lot of open-source libraries that can call our API, even if somebody does not want to work directly with it. We have a library for Python, for JavaScript, for C#, for Java, and for all the popular languages. So, you can code in the language you like, and you can use the technology you like. If you like pull- or push-type integration, use it. If you would like to rely on webhooks, use them. If you would like to use a technology like MQTT, then, okay, use it.
Erik: Okay. Some of the cases that you've already referenced sound like they have fairly sophisticated business architectures around them, especially the second bucket of activities related to aggregating data from multiple companies and packaging and defining business models around that. Do you get involved in a more consultative manner in terms of helping companies develop this? What does that look like?
Krzysztof: Yes. If you would like to collaborate on data, this is not a simple case like creating a virtual representation of a meter, which is relatively straightforward. For a meter, you probably need error logs plus the measurement, and that's all. But when you're talking about master data records, about federating data and collaboration, then of course we provide consultancy: how to model data, how to set those visibility rules. This is the expertise that we have and this is what we do, especially in the initial phase of a project. We usually work with the client at the POC level so that the data is modeled in the right way, and to teach them how to model data in the best way to unlock its potential, especially when talking about collaboration. So, we do that.
Erik: Yeah, exactly. I think there's a lot of business complexity in those models that has to be addressed.
Krzysztof: That's exactly the point. The complexity is not in the technology, because that is a simple call to the API. The complexity is in how to design your business and how to design your data model, so you can unlock the potential of the data you would like to use or collaborate on.
Erik: Okay. Great. Then I see you also have this Trusted Twin Academy. I guess part of the goal here is probably to educate your customers around this. What is the academy?
Krzysztof: The platform is dedicated to developers, so we have technical documentation which is highly technical — a lot of APIs, numbers, calls, and so on. That's for developers. But as we've discussed, the problem is in the business, in the modeling of data. The Academy is less technical documentation, focused on giving the user the concepts that we use on the Trusted Twin platform. What is an operational Digital Twin? How can you work with, for example, different identities? How can you anonymize data? How do you use the subscribe-notify mechanism? How do you use indexes, time series, and all the tools available on the platform? Once again, it's less technical, more conceptual documentation for users who don't want to dive deeply into how to implement but want to know how to design.
Erik: Okay. If, let's say, I am building an as-a-service business around an IoT device, and I have a large array of customers using this, and I'm sharing data with them through the platform, would all of those users be directly accessing the platform? Or would they be receiving the data in some other system, not necessarily interacting with your platform directly?
Krzysztof: It depends. But the main idea is to allow users, especially those you want to cooperate or collaborate with, to access the data directly via our platform without proxying the data through other systems. Only in that case can we provide scalability, availability, and so on. Otherwise, if you route everything through your own platform, then, again, you will have to provide the bandwidth, computational power, and so on.
The platform is designed so that everyone can interact with it directly and choose their integration method, technique, and technology without asking. Only the governance is covered centrally, and the governance is always there. This is what you can set: I can leave you the freedom to integrate the way you want, but on my side, there will always be the ability to manage what you are able to see on the platform. This is what I mean.
Erik: Okay. Clear. It sounds like a great solution. We're a consultancy at IoT ONE, and we're constantly dealing with challenges where companies are trying to figure out how to monetize data in some way. I think, to your point, the technology issues are addressable. But you basically need an integrated solution that allows you to solve the technology and also to understand how to interface that with your business.
I think you have a few good examples here. You've already mentioned a couple, but it'd be great if you could choose one and give us a bit more of a walkthrough on why did they need this, how did they integrate your solution. Then if you're able to share some of the outcomes, what do those look like?
Krzysztof: Maybe let's return to the example I've been talking about, because it is also related to IoT. So, we have that meter manufacturing company. They do not have IT expertise related to the cloud. They have an internal subsidiary (this is also a publicly listed company) which is responsible for embedded software: people who know how to call a REST API and how to work with MQTT, but who do not know how to build a cloud infrastructure that can deal with millions of meters sending measurements within, for example, a minute. That would be a problem. They want to be responsible for putting measurements into the platform. That's what they want to provide: "I have a meter. I will send my installer to install the meter somewhere. Then I can guarantee that the measurement will be stored in the meter's operational Digital Twin." So, this is how they work.
On the other hand, they are providing that service to thousands of building administrators, who can have very different IT systems: some built 10 years ago, some just bought, some leased. They don't want to be involved in those integrations. They simply say: if you know the serial number of a meter, you will be able to access its measurements via the Trusted Twin platform, but it's your responsibility to integrate.
Again, you might want to build something more on top of that meter. For example, you can add to the meter's object an address and the name of the user, or information for balancing, such as how much water flows into and out of the system. But from the perspective of the meter manufacturer, this is not data I want to deal with, because it is not my data. I don't want to store your GDPR-related data like the address or the name. This is your data. I don't want to see it, but you might need it as part of that object's information.
But when I am sending my installer to, for example, replace the battery, then I will ask you to grant access to that data, the address, for the purpose of that installation. I don't want to see it, but please grant access to my installer. Then the installer will add data such as the initial measurement shown on the meter. Because the meter simply keeps counting upward, I need to know that initial number. But this is data visible to me. The meter is also, for example, generating error logs. These error logs should not be visible to the end customer. They should only be visible to the IT company within my holding that develops the software. They don't want to be connected to the meter 24/7. That's why they're using our platform: to simply download those logs once a day, while the platform is always operational. Again, availability, scalability, and so on.
Another problem is related to the data itself. The meters send their measurements a few minutes after midnight, so you have almost no traffic during the day. I'm talking about measurement traffic. But then you have one million measurements in five minutes to be handled by the platform. We provide that scalability. This is, again, something they don't have to worry about, even with millions of meters.
As you can see, for them the solution is very simple. They create a virtual meter for each device. They store the measurements. They say to their customers: if you want to integrate, here is the Trusted Twin Academy, here is the Trusted Twin documentation; these will help you integrate. You can read the measurements any number of times you want, because we are separated. We are not afraid that you will do something that will harm our internal services or our internal software, because we are separate. Everything goes through the platform.
Again, from their point of view: our core business is making meters. We would like to offer a digital service. Part of that service is publishing a measurement. That is all we do. The rest is covered by Trusted Twin. We pay less than €0.01 per month for that and don't have to worry about what will happen, because this is the service that we use. So, the transition from selling devices to selling a measurement service was very easy for them. They were able to start offering this new service in three months, where most of the work was getting the meters to send measurements to the cloud. No initial investment, a very short period to prepare a POC, and very fast ROI. That is, from the business perspective, what they received.
I can share a few more examples. For example, we are very good for startups, because startups usually do not have time to design and build proper cloud infrastructure. Using our platform is inexpensive because they can start from a free sandbox. It makes them secure and scalable and provides availability from day one. They can stay focused on what they want to achieve from the functional perspective of their business.
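[Editor's note: the per-device twin workflow described above can be sketched roughly as follows. This is a minimal illustration in Python; the endpoint paths, field names, and base URL are assumptions made for the sketch, not the actual Trusted Twin API.]

```python
import json

# Minimal sketch of the "one operational twin per meter" pattern.
# All endpoint paths and field names below are illustrative
# assumptions, NOT the real Trusted Twin API.

API_BASE = "https://api.example-twin-platform.com"  # hypothetical

def create_twin_request(serial_number: str) -> dict:
    """The manufacturer creates one virtual twin per physical meter."""
    return {
        "method": "POST",
        "url": f"{API_BASE}/twins",
        "body": {"description": {"serial_number": serial_number}},
    }

def publish_measurement_request(twin_id: str, reading_m3: float, ts: str) -> dict:
    """The manufacturer's only job: write each reading to the twin."""
    return {
        "method": "PUT",
        "url": f"{API_BASE}/twins/{twin_id}/ledger",
        "body": {"entries": {"reading_m3": {"value": reading_m3, "timestamp": ts}}},
    }

def read_measurement_request(serial_number: str) -> dict:
    """A building administrator reads by serial number; integrating
    this call into their own IT system is their responsibility."""
    return {
        "method": "GET",
        "url": f"{API_BASE}/twins?serial_number={serial_number}",
    }

# Example: the installer records the meter's initial count so later
# readings can be interpreted (the counter only ever increases).
req = publish_measurement_request("twin-123", 4821.7, "2023-01-01T00:03:00Z")
print(json.dumps(req["body"]))
```

The point of the pattern is the separation Krzysztof describes: each party only ever talks to the twin, never to the other party's systems, so governance and access can be managed centrally.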
Erik: It already sounds like a very useful solution for anybody who wants to build a new business model around sharing or collaborating on data. You're also still a relatively young company. If you look at the product development roadmap over the next 12 or 24 months, what are the developments that you're most excited about?
Krzysztof: Around two months ago, we closed functional development. We said, okay, the product is done from the functional perspective. New functionality will be added upon request from end customers; if there is a need which can be generalized, then we will add that functionality. Today we are focused on going multi-cloud. We are based mostly on AWS, and we would like to also be available on Azure. So, that's the first priority.
The second priority is to build out compliance. As I said, this is SOC or ISO: as many certifications and standards as possible, so that adoption is easy especially for enterprises that need those standards implemented, for example if they want to share GDPR data, financial data, or something like that. These are the two main areas of development we are focused on today.
Erik: This is a little bit of a tangent, and it's probably outside of your priorities right now. You might be aware that I'm sitting here in China. There's a data security law that's been floating around for a few years now, and in Q1 it's going to start being enacted. That means that everybody right now is scrambling to figure out what data we have to store in China, what data we can move outside and, basically, how we access it.
I had a client, a construction materials company out of Germany, with fairly large operations here in China as well. They had a smart sensor solution that they wanted to enter into the Chinese market, but they didn't know how. Okay, we're collecting this data from devices in the Chinese market, but we still need to be able to analyze it back in Germany, because that's where all of our expertise is. Basically, how do we do that?
I'm wondering whether this applies. I think India is starting to have a similar framework. So, this might be a bit of a global trend of governments saying that especially any kind of PII data, or any data that might be related to government infrastructure (if it's sensors going into buildings or something), might need to be stored in the country. Do you see this as a use case for managing data?
Krzysztof: Exactly. It is even worse. Have you ever heard about the Data Act, which is going to be announced probably this year or at the beginning of the next one, and which is going to be implemented in 2024? It is, let's say, a tsunami similar to GDPR but related to IoT data. I don't want to jump into that, but what we also provide is compliance. Most companies are thinking about their core business. They don't want to deal with compliance until they must. But we are thinking about that. We participate in the bodies which are responsible for those regulations. That's why we can also guarantee that when those regulations come, our platform will be compliant with them.
This is exactly what is going to happen. So, you need to have the ability to decide where your data is stored. We can provide that. If you want your data to be stored in Ireland or in Germany, then we can do that. All your operational Digital Twins will be stored wherever you want them to be stored. We'll be ready for all those regulations that will come, especially in Europe. Because Europe will probably be the first that will implement this type of regulation. So, please look into the Data Act, because this is something that will come soon. That's exactly related to IoT.
Erik: Interesting. You're going to have, potentially, the situation where companies are building solutions that need to be accessing data from around the world, whether it's a supply chain solution or anything like this, but the data has to be stored locally. Okay, well, this could be a good business opportunity. At least regulation creates opportunity for some entrepreneurs.
Krzysztof: Okay. Hopefully not, because this is not the main, let's say, value proposition of our platform. But regulation is also a problem that companies have to deal with, and in the current situation in the world, it will probably become more and more severe. So, this is just the beginning of regulations, especially related to data.
Erik: Yeah, great. Well, you're building a great business here. I think, especially this topic of business model innovation for IoT, it's really starting to take off as companies start to see some success cases and open their eyes to what's possible there. It's great to see you're building a really strong enabler here. Anything that we haven't shared with folks yet that's important for them to know about Trusted Twin?
Krzysztof: I think that we covered almost everything. What I can only add is that, if you have any questions, because not everything was explained in detail, please contact me. I will gladly tell the people listening to the podcast more. I think the most important thing is that gathering data is one part of the problem. But once you already have the data, the real problem is how to make good use of it.
In the past, most people talked about big data analysis, recommendations, business decisions. But what we can see now is that good use of data means using it in operational processes and collaborating on it. This is what we see, and this is what I think will be the future.
Erik: Perfect. Yeah, folks, please reach out to Krzysztof directly or you can always get in touch with me, and I'll put you in touch. Krzysztof, thank you for your time today.
Krzysztof: Erik, the same. Thank you for your time. Are you really in China?
Erik: Yes, in Shanghai.
Krzysztof: Okay. All the best for you then, and all the best for Christmas. Happy New Year for you and for your family. Thanks a lot, Erik.