
EP074 - Industrial edge computing - Martin Thunman, CEO, Crosser

Nov 06, 2020

In this episode, we discuss trends in edge computing, the relationship between the cloud, the edge, and on-premise environments, and low-code development environments that allow operations teams to participate directly in application development and reduce the lifetime cost of a solution.

Martin is the CEO and co-founder of Crosser. Crosser designs and develops real-time software solutions for edge computing and next-generation real-time enterprise integration. The Crosser edge computing solution offloads cloud services and provides real-time data processing and decision-making capabilities close to IoT sensors and devices. https://crosser.io/


Erik: Welcome to the Industrial IoT Spotlight, your number one spot for insight from industrial IoT thought leaders who are transforming businesses today with your host, Erik Walenza.

Welcome back to the Industrial IoT Spotlight podcast. I'm your host, Erik Walenza, CEO of IoT ONE. And our guest today is Martin Thunman, CEO and cofounder of Crosser. Crosser develops streaming analytics and integration software for the edge, on-premise, or cloud, enabling real-time processing of data for industrial customers. In this talk, we discuss trends in edge computing and the relationship of the edge to the cloud and on-premise environments. We also explore the potential for low-code programming to enable operations to participate directly in application development and to reduce the lifetime cost of a solution.

If you find these conversations valuable, please leave us a comment and a five-star review. And if you'd like to share your company's story or recommend a speaker, please email us at team@IoTone.com. Thank you. Martin, thank you for joining us today.

Martin: Thank you very much. Happy to be on the show.

Erik: I think you have quite an interesting background as a serial entrepreneur and board member. Can you walk us through a couple of the experiences that led you to founding Crosser back in 2016?

Martin: Absolutely. I'm a business guy. I believe people have a certain DNA, and my DNA is sales. I'm a sales guy by nature, and that's also my background. I have a business degree. I started back in '96 with Cisco, basically as low as you can get, in their inside sales group in Amsterdam. I followed that up by moving into their field sales organization, stayed there for about five years, and left Cisco in '01 to cofound my first company, PacketFront, in the fiber-to-the-home space. We built hardware and software for fiber-to-the-home operators around the world and took that to about 35 million euros in revenue in five years.

The company was hit really hard by the 2008 financial crisis. Our customers pulled the brake, and the company was not able to continue its growth trajectory, so it was split into three different parts. But it was a fantastic learning experience, building a company globally and selling into five different countries around the world. I moved on, stayed with a couple of other startups, and was CEO of a Norwegian enterprise software company for about 4.5 years.

Then I was introduced to my cofounder at Crosser, a developer who over several years had built a real-time analytics engine but didn't really know what market to go after. Together with a third person, we looked more broadly at the market and asked: where is the need for a very modular, very fast, real-time analytics engine? We quickly realized that when industrial companies want to connect the different data sources they have, so machine data, enterprise data, cloud platforms, and SaaS applications, there is a need for a real-time analytics engine in the middle of all this. And that's how we decided to start Crosser and go after the industrial digital transformation that's going on.

Erik: You've already mentioned real-time analytics, and you have streaming analytics in the background behind you. So maybe, as an introduction to Crosser, we can give the audience a quick overview: what do we actually mean by real-time streaming analytics versus the analytics that have been used in previous decades, just to give some context around what's new in the value proposition that Crosser is bringing to market?

Martin: Yeah, I think there's a shift going on in the market. And analytics is a broad concept, as you correctly state. What we do is streaming analytics, meaning it's data in motion. We don't store data. We don't look at historical trending; that is what traditional analytics platforms have been doing, visualizing data over time. We're taking in real-time data as it is being produced, we're analyzing it in real time, we're looking for patterns, and we're creating actions based on the data, and so forth. So data in motion versus data at rest, that's the distinction that I think makes the most sense.
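
[Editor's illustration: a minimal Python sketch of the data-in-motion idea Martin describes. Readings are processed one at a time as they arrive, only a small rolling window is kept in memory rather than a historical store, and an action is emitted when a simple pattern (a sustained threshold breach) is detected. The feed, window size, and threshold are hypothetical and not tied to any Crosser API.]

```python
from collections import deque

def stream_processor(readings, window_size=10, limit=80.0):
    """Process readings as they arrive (data in motion).

    Keeps only a small rolling window, never a historical store, and
    yields an action whenever the rolling average breaches the limit.
    """
    window = deque(maxlen=window_size)
    for value in readings:                      # one reading at a time, as produced
        window.append(value)
        rolling_avg = sum(window) / len(window)
        if rolling_avg > limit:                 # pattern detected: sustained high value
            yield {"action": "alert", "rolling_avg": round(rolling_avg, 1)}

# Usage with a hypothetical feed of temperature values:
feed = [72, 75, 79, 83, 86, 90, 91, 88]
for action in stream_processor(feed, window_size=3, limit=85.0):
    print(action)
```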

Erik: Who are the companies that you're typically working with? Is this a fairly horizontal solution where anybody that has data in motion could adopt this for their application? Or do you focus on very specific verticals?

Martin: Tech is tech, right? If you have a pump, whether it's in pulp and paper, in a building, or in a city somewhere, the machine data really looks the same. So in many ways, it is very, very horizontal. However, as a small company, when you enter a market, you need to be really good at something. So you need to build expertise in one segment.

So although the platform is horizontal, and you could take in streaming data from trading or banking from a platform perspective, we focus purely on the industrial segment. Those are manufacturers with production sites and plant automation, managed service providers, machine builders, and OEMs. Those are the typical industrial companies, in any type of form.

Erik: If you look at industrial companies, there are also many different types of use cases where streaming data could be useful, many different types of problems they could be addressing. But I suppose you also have a focus area in terms of the types of equipment you like to work with, or the situations that are the bigger opportunities or where there's a bit more pain. What are the focus applications? Or what would be the top three areas where somebody might find a lot of value in streaming analytics in the industrial space?

Martin: On one hand, we want to provide full flexibility for our customers, meaning we want to provide a big Lego box of building blocks that customers can put together easily to build and address whatever use case they have. There's a challenge to that model, and it's a challenge we've known about for a while. When we were young, we got the Lego box of a thousand pieces and we were told to build something.

But today, kids get the Lego Star Wars prepackaged set with all the instructions, and they build amazing things with tailor-made pieces that only fit that specific model. And what we have realized is that just providing the box of pieces actually limits our customers. So what we've done is start to package use cases, something we call flow apps, which is basically a way to simplify even further for the customer and give them ideas and examples of what they could build with this.

Coming back to your question on the top three: I think a lot of companies start with just getting access to the machine data. You have factory floors with different generations of machines; some could be decades old, some brand new. So it's a challenge in itself just getting hold of all the machine data. But the next step is then to harmonize that data, basically do a transformation, and then act on top of it.

So just getting hold of the data, structuring it, cleaning it, getting rid of the raw data you're not interested in, filtering that away, and then moving what we call relevant data to the next layer in the tech stack; that could be your cloud platform, your industrial IoT solution, or your own storage. You can then start to analyze and get insights from the data. But you want to have good data, structured data, relevant data. And that's the first use case where many start their industrial IoT journey. So that's one starting point.
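
[Editor's illustration: a hedged sketch of that first use case in Python: take raw readings from a machine, harmonize them into a common structure, filter away what isn't relevant, and forward only the rest to the next layer. The field names, allow-list, and the forward_to_cloud stub are hypothetical placeholders, not Crosser modules.]

```python
RELEVANT_SIGNALS = {"temperature", "pressure"}    # hypothetical allow-list

def harmonize(raw):
    """Map a vendor-specific raw record into a common structure."""
    return {
        "signal": raw["tag"].lower(),
        "value": float(raw["val"]),
        "unit": raw.get("unit", "unknown"),
    }

def forward_to_cloud(record):
    """Placeholder for sending to a cloud/IIoT platform or local storage."""
    print("forwarding:", record)

def process(raw_records):
    for raw in raw_records:
        record = harmonize(raw)
        if record["signal"] not in RELEVANT_SIGNALS:
            continue                              # filter away data we don't need
        forward_to_cloud(record)                  # only relevant, structured data moves on

process([
    {"tag": "Temperature", "val": "71.2", "unit": "C"},
    {"tag": "FanSpeed", "val": "1200"},           # dropped by the filter
    {"tag": "Pressure", "val": "2.4", "unit": "bar"},
])
```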

Another part that is very common is factory floor automation, so machine-to-machine logic: acting on the machine data from one machine to create a trigger for the next machine to do something. Closed-loop automation is another well-used solution coming out of this: basically taking machine data and changing settings in another machine to adjust whatever that machine is doing; it could be how much liquid is put out in the process, or something else. So automation, using this solution as a mini control system on the factory floor, is another very common application.

Then you have machine data to enterprise data integration. So, acting on machine data and creating a trigger that's sent into an ERP system, for instance, to create an automatic work order for a technician to go service the machine based on health monitoring. Those are typical use cases where companies start today.
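
[Editor's illustration: to make the machine-to-enterprise pattern concrete, here is a small hypothetical sketch: a health-monitoring condition on machine data creates a work-order payload of the kind that could be posted to an ERP system's API. The payload fields, machine ID, and threshold are invented for illustration.]

```python
import json

def check_health(machine_id, vibration_rms, limit=4.0):
    """Return a work-order payload if the health condition is breached, else None."""
    if vibration_rms <= limit:
        return None
    return {
        "type": "maintenance_work_order",         # hypothetical ERP payload
        "machine": machine_id,
        "reason": f"vibration RMS {vibration_rms} exceeded limit {limit}",
        "priority": "high",
    }

order = check_health("press-07", vibration_rms=5.3)
if order:
    # In a real integration this would be an authenticated POST to the ERP system.
    print(json.dumps(order, indent=2))
```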

Erik: You mentioned a couple of things there. One thing that jumped out is the question of the readiness of the customer to prepare, on the one hand, the data, and on the other hand, the processes. Because a typical problem here is trying to automate a process once you have the capability, but before that process is properly structured to enable automation. Have you found that you are working more with the Fortune 2000-type companies that have a lot of maturity in their infrastructure, or are there medium-sized companies finding a lot of value in being able to avoid the need for a really heavy traditional architecture and do things lighter using IoT solutions? Who are you finding to be ready for this? And then maybe afterwards we can talk a little bit about the challenges of having the foundational infrastructure and processes in place to be able to make sense of, or make use of, streaming data.

Martin: We actually did an analysis of this just a couple of weeks ago, and it's basically very evenly distributed in our project base. We have real global enterprises, so billion-dollar-plus types of companies; then we have the half-billion to one-billion bracket; and then 100 million to 500 million in the other bucket. It's pretty evenly distributed.

I would say that even the larger companies often start with a part, a division or a plant or some area. I think we often make the mistake of thinking that big logos mean global scope. Most companies are pretty distributed. They have grown through acquisitions. They have different plants that produce different things, and those plants are often pretty autonomous. So even if it's a big company, the starting point is mostly medium-sized. Even if we look at the 10-billion-plus enterprises, the starting point looks much more like a mid-enterprise start. It's very seldom today that we see a big company come out and say, now we're going to do an RFP and a global rollout for all our sites at once.

And I think it comes back to what you mentioned about readiness: we're still fairly early in the market. Companies understand the value proposition today. I don't think we need another webinar to discuss why [inaudible 14:52] makes sense. Boards around the world are talking about it; digital transformation of industry is a hot topic.

I think what's happened now with the pandemic has actually accelerated those discussions about digital transformation. But then how do you go from a vision to [inaudible 15:23] with a clear outcome? And this is where companies today struggle a bit. Everyone in the industry has a lot of responsibility to help the end market through this jungle.

Erik: When you first begin working with a new customer, do you often find yourself in the position of advising them on what infrastructure they would need in terms of sensors or field connectivity to get data out of equipment? Or do you find that the companies you're working with typically have the data, they're just not combining it in ways that would maximize the value right now?

Martin: So we talk a lot about OT and IT convergence and collaboration. I think the OT people normally know their stuff. They are really, really good at understanding how to get the data out of all their machines. They don't always have the tools to do it, but they know this. So we don't feel that we end up in an advisory situation on that part; they just don't have the tool. But there is a gap, of course, between getting data out of the machines and then making sense of the data and creating business value out of it. And that's the piece where we find ourselves going into a lot of discussions. We do a lot of use case discussions with our customers and advise them on the possibilities for finding value in their business or in the machine data.

Erik: Maybe we can talk about who your customers are from a different perspective, a stakeholder perspective. You've already mentioned that typically this is somewhat of a bottom-up approach; you'll be working maybe with a factory or a specific business, rather than top-down from corporate headquarters. But who would you be engaging? Is this the factory GM? Is it the engineering department? Is it IT? Who would be the typical leader driving the discussion and the prioritization? And is the budget also coming from them? Or does it become a little bit complicated at the stakeholder level in terms of who drives and who advises on decisions?

Martin: In enterprise sales, it's always complicated with roles, so there are multiple stakeholders in this. The beauty of working with enterprises is that no company is the same. But in general, I would say the initiatives are driven by mid-management roles. That could be a smart factory director or the IT director. It could be senior automation engineers. In a lot of cases it's actually enterprise architects; they're the ones looking at putting all the pieces of the tech stack together, that's the role.

So I think mid-management is the focal point. But then you have, of course, the hands-on engineers, mostly on the IT side and the data science side, as well as on the OT side. And then we have, of course, the budget owners, which sometimes are the mid-management but often aren't. So there's an economic buyer, which could be a CIO or a plant manager or something like that. So it's always a mix.

Erik: And then maybe the follow-up question here would be, what are the goals that these stakeholders have? If we just think about it from a high-level perspective, revenue versus cost, my experience right now is that most of the use cases we see around this, especially when you're talking about factories or industrial settings, are focused on cost reduction or risk reduction, maybe with EHS issues.

We see a fair amount of interest in topics related to potential revenue growth, things like mass customization, which could provide a new value proposition to the customers of the manufacturer, but not so much actual deployment on those topics. How do you see this? Is it the vast majority cost reduction simply because it's easier to make the business case there? Or do you see a wider array of underlying goals behind the deployments?

Martin: I think cost reduction is the main driver in this. Of course, everyone is looking to reduce downtime of assets and machines, and there is, of course, a cost reduction part in that. But there's also a revenue piece, because if you can have higher utilization of the machines, you can produce more per asset. So there is a revenue aspect to that topic. But overall, increased digitalization and optimization are often about improving processes, reducing manual steps, and shortening cycle times. So there are other aspects to this.

I also spoke to one plant manager who had a different challenge he was looking at. He said, look, within 3-5 years, 40% of my maintenance engineers are going to retire, and I don't know how I'm going to fill those positions. Because the young generation, they don't want to be maintenance engineers in the pulp and paper industry somewhere. That's not cool on Instagram. So his challenge was: I need to find a way to digitalize a big part of that job description, because otherwise we're going to have a big problem, because we can't fill those positions. So that's actually an interesting aspect of the goals of digitalization here in the industry.

And I've talked to several that are facing the same or similar challenges with an ageing workforce and not being able to recruit for those roles.

Erik: So factories that maybe run very well, but thanks to a very experienced workforce that has its own life cycle, and then the question is, okay, what happens once these people retire?

Martin: Yeah, they have engineers who have been around for 30-40 years; they go down to the shop floor, they stop, and they say, well, something's wrong with this machine because I can hear it in the sound. How do you replace that experience and know-how? They're dependent on that type of skill. So now they're looking at, okay, how can we use modern technology?

Erik: One last question on the business side, and it's a question about system integrators. Are you generally engaging directly with your customers and managing that relationship, including any needed deployment support? Or are system integrators playing an important role here in helping to bring Crosser to market or in customizing applications for your customers?

Martin: I would say both. So we have customers that come to us that have the tech stack figured out already. But they have a hole in the tech stack, where it's about edge analytics and factory floor integration and automation for these use cases. And our brand promise is that we enable a self-service model. So we have typically a very light onboarding training that we do completely online, then the customers are using our tools themselves in our solution. We engage directly.

But many customers start without having the whole tech stack figured out, and that is where systems integrators play an important role. Because some of these customers might not have the bandwidth, because they have just the team that they have, or they don't have the experience, so they want outside support in building out this new solution.

So: which IoT gateway should I run? What should I be running on it? Which cloud platform? Which analytics? There are so many layers in the tech stack to figure out, and this is where I believe SIs and consultants play a super important role in helping. So we actually do both: we work directly, often when customers come to us, or we work with systems integrators.

Erik: So let's now turn to the technology side of the business. We've already defined streaming analytics; could you do the same favor here and define edge computing, in particular the relationship between edge computing and cloud computing, and also edge computing as we think about it now versus the on-premise server computing that might have been done over the past few decades? What is edge computing? How does it relate to cloud and on-premise servers? And why is this an important development?

Martin: So there's a lot of, I would say, confusion about the concept of edge. I was invited to a conference some time back in London to speak about edge computing. I was on this panel, and it was a super mixed panel. There were some people talking about the data center edge. For them, it was: we're an edge computing company, we have eight data centers in the UK.

So the edge really is processing the data as close to the data source as possible, for a number of reasons. It could be for cost reasons, because there's a lot of raw data being produced, and you can do a 99% or more data reduction. So you save cloud costs, and bandwidth costs for mobile and remote assets. So preprocessing of data locally, that's one part.

Another reason is latency: if you want to do automation, you want very short latency, and there's no reason to send something to a remote data center and back again. And a third reason is security, which is a big topic. A lot of the use cases, especially on the factory floor, stay within the factory floor, inside the firewall. So taking data outside the firewall to the cloud and back again does not make sense from an architectural perspective.
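
[Editor's illustration: a back-of-the-envelope view of the cost argument, with hypothetical numbers. A sensor sampled at 1 kHz produces 86.4 million raw values per day; forwarding one aggregated summary per second means only 86,400 messages leave the site, a roughly 99.9% reduction.]

```python
SAMPLE_RATE_HZ = 1_000                  # hypothetical high-frequency sensor
SECONDS_PER_DAY = 24 * 60 * 60

raw_per_day = SAMPLE_RATE_HZ * SECONDS_PER_DAY    # 86,400,000 raw samples
forwarded_per_day = SECONDS_PER_DAY               # one summary per second
reduction = 1 - forwarded_per_day / raw_per_day

print(f"raw: {raw_per_day:,}  forwarded: {forwarded_per_day:,}  reduction: {reduction:.1%}")
# raw: 86,400,000  forwarded: 86,400  reduction: 99.9%
```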

So one of the questions is, what's different between on-premise and edge analytics? In some ways it's the same: you're processing data locally. But I believe that with modern edge analytics, and I think this is why people use a new term for it, we're leveraging modern cloud technologies but deploying them locally at the edge, on the factory floor. So this is scalability, this is simplicity, it's modern technology. It's a new tech stack that's being developed.

So many are talking about the edge as part of an end-to-end solution, where the cloud and the edge are parts of one stack. For some companies it's the cloud; for others it's a data center, if they haven't moved to the cloud yet. But it's part of one solution, whereas traditional or older on-premise solutions are typically independent applications that run in their own silos.

Erik: On-premise solutions, typically, are going to be siloed or on a centralized server. We've seen some companies that are really emphasizing the ability to deploy code in multiple locations: we put some of our code base in the cloud, some of it on the server, and we can also put some of it in gateways or even on edge devices, even on sensors with very limited compute power. How critical is this? Is this part of Crosser's value proposition? Or is this a trend that you see as important for edge development?

Martin: We're a technology company, so I would love to say that everything has to be done at the edge. But that's simply not true, because it depends on the use case for each specific customer. Sometimes it makes more sense to process part of the data in the cloud. Sometimes it makes sense to do it locally at the edge. Sometimes it makes sense to do it on the device itself, on the sensor or on the unit.

I think the winners in this are actually the end users, who will have solutions tailored to whatever use case they have [crosstalk 29:25] and having different deployment options available is, I think, a good thing.

Erik: Well, why don't we now get into Crosser's tech stack? How would you define your technology stack?

Martin: Yeah, so our tech stack has two main parts. We have our Crosser node, which is the real-time analytics engine that gets installed on an edge gateway, on a factory floor server, on a VM somewhere, or it could also be installed in a VM in the cloud, for that matter, to do streaming analytics. But typically it sits at the edge somehow.

I think what we have done a little bit differently from others is that we have built the Crosser cloud, which is a design and management cloud. So we have actually separated the design from the node itself. That's done centrally in the cloud platform using a drag-and-drop interface where we have the pre-built modules. So it's basically a low-code solution for industrial IoT.

So those two pieces in the tech stack work together. The Crosser cloud does not process any customer data; it's a control plane only. That gives a couple of advantages. One advantage is that you can design once and deploy to many at once. So if you have 10 or 10,000 nodes deployed running the same type of processing flow, it's one operation to update it and roll it out. You design and work in the cloud, and you roll it out to all nodes simultaneously.
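
[Editor's illustration: a minimal sketch of the design-once, deploy-to-many idea: one flow definition held centrally and pushed to every registered node in a single operation. The flow structure, node registry, and push_to_node stub are hypothetical and do not reflect Crosser Cloud's actual API.]

```python
# Hypothetical flow definition designed once in a central control plane.
flow = {
    "name": "pump-monitoring",
    "version": 7,
    "steps": ["opcua_in", "harmonize", "threshold_alert", "mqtt_out"],
}

# Hypothetical registry of edge nodes that all run the same processing flow.
nodes = [f"edge-node-{i:03d}" for i in range(1, 6)]

def push_to_node(node_id, flow_definition):
    """Placeholder for the control plane pushing a new flow version to one node."""
    print(f"{node_id}: updated to {flow_definition['name']} v{flow_definition['version']}")

def rollout(flow_definition, node_ids):
    """One operation from the operator's point of view: update all nodes."""
    for node_id in node_ids:
        push_to_node(node_id, flow_definition)

rollout(flow, nodes)
```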

So, that comes from our experience that I mentioned before in building solutions for telcos and broadband operators: scalability is key. What's often forgotten is that only 10% of the total lifecycle cost, or total cost of ownership, is actually the software cost itself; it's the manual cost over the lifecycle that is the major part of your total cost of an IoT project.

So we are trying to address that part, both in the design phase, by having a low-code platform that empowers existing OT and IT teams to innovate and do much more than they otherwise could, and by automating all the lifecycle tasks of deploying at scale, upgrading, making all these changes, and continuing to innovate. So those are the two pieces, and together they form the solution from Crosser.

Erik: This low-code topic is critical, as you said. There's a tremendous amount of hidden cost that doesn't often get considered because it's internal cost, right? You do an RFQ and you benchmark on that basis. For you, what does low code mean? If I'm an OT engineer without a lot of IT background, what would be the learning curve to be able to either develop, or at least modify, applications on the system?

Martin: Yeah. So as I mentioned earlier, our ambition and our brand promise is that we empower self-service. If you're an OT or IT engineer, you should be able to log into the platform and, through learning documentation and videos, start learning and building yourself, without consulting support from us, etc.

Typically, we do an onboarding training: a couple of hours of online training sessions where we train the users. We even run regular online training, so any listeners who want to learn about edge analytics and take a training course on it, we have a free online course you can sign up for. It covers data logistics in general, but also, of course, the Crosser platform.

And the whole concept is that we have these building blocks that we call modules. There are modules for data connectivity, so connecting to different data sources, and here you typically have an 80-20 rule in the machine world: there are a couple of protocols that support most of the machines. It's OPC, Modbus, MQTT, HTTP, Siemens S7, Allen-Bradley, REST APIs; those cover most of it. Then there are a couple of others; it's a long tail, and those we build and reprioritize based on projects.

So connectivity is the modules for connecting and getting data in. Then we have modules for data transformation and data harmonization. There are modules for intelligent logic, so analyzing and taking actions on data; there are a lot of modules for that. There's also an empty code module, so you can drop your own code in. If you have built your own [inaudible 35:06] algorithms or your own ML algorithm or model, you can deploy it in combination with the prebuilt modules from us.

Intelligent logic is the third group. Then we have actions and workflows: what am I going to do if I find an anomaly? And then integration back to machines, to enterprise systems, to cloud providers, or to SaaS applications. From those five groups of modules, customers choose the modules they want to build their use case, or they choose the flow apps I mentioned earlier as a starting point, as a template for building a specific use case. So it's very, very hands-off from our perspective.
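
[Editor's illustration: a hypothetical way to picture the five module groups (connect, transform, intelligent logic, action, integrate) as a small declarative pipeline with a tiny runner. The stage names and functions are invented; a real flow would be composed from prebuilt modules in a drag-and-drop designer, not written in Python.]

```python
def connect():                        # 1. connectivity: get data in
    return [{"tag": "Temp", "val": 92}, {"tag": "Temp", "val": 70}]

def transform(record):                # 2. transformation / harmonization
    return {"signal": record["tag"].lower(), "value": record["val"]}

def logic(record):                    # 3. intelligent logic: flag an anomaly
    record["anomaly"] = record["value"] > 85
    return record

def action(record):                   # 4. action / workflow when an anomaly is found
    if record["anomaly"]:
        print("raise alert:", record)
    return record

def integrate(record):                # 5. integration to enterprise systems / cloud / machines
    print("publish:", record)
    return record

pipeline = [transform, logic, action, integrate]

for record in connect():
    for stage in pipeline:
        record = stage(record)
```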

And we are trying to push this even further now. We are starting on a next generation because, of course, we took this to market, and we believe it's the leading platform by far, but we have learned things we can automate to make an even simpler experience for the end user. So we want to keep pushing this even further. That's really our brand promise in this market compared to the rest of the industry.

Erik: This is an interesting topic: you're a deep tech company, yet simplification is maybe the key variable here. There's a lot of open-source machine learning technology out there, a lot of very powerful tools people can use, but actually being able to use them and make them practical is the real challenge. Is that how you see the situation?

Martin: Yeah, it is. And it comes back to resources again. One plant manager I spoke to told me: Martin, right now there are 1.5 million job openings in the world for software developers. I'm at a manufacturing site in a midsize town here in Europe. So how am I going to recruit top developers to build this myself, or build on top of platforms that assume code and developers? It's getting tougher every year to find and retain good people.

So there's the people issue: empowering the existing team, who you might not think have those capabilities to start with, to actually do much more, but also helping with this OT and IT bridging and collaboration. If you think about what we described before with the five different steps, the first steps, so getting data in, understanding the data, doing data transformation and harmonization, that's the OT side, the machines.

But then you have the intelligent logic, the algorithms and actions; those could be subject matter experts, or it could be data scientists. Then you have the integration workflows to other systems, enterprise systems, cloud platforms, SaaS applications, and that is the IT people's domain. So for the first time, they have one platform where they can collaborate, they can contribute to different parts of the project, and they can visually see what the other team members have built, which provides a lot of benefit.

And also, if someone builds something and then leaves, it's still there, it's graphical, and others can see everything that has been built. There are some settings, like putting in settings on your smartphone. So it's super simple for everyone to inherit something that has been built by someone else. That's hard with code.

Erik: I have a very simple perspective; I'm not a data scientist at all. So when I think about data problems, I think, okay, there's one type of problem you can solve with logic algorithms: some domain expert understands the logic, and it just has to be transferred to an algorithm and then automated. Then you have your big data type situations, where maybe it's a quality issue, so quality detection or predictive maintenance or something like this, where you have a lot of data points and enough instances of a fault that you have good training data.

And then there are situations where it's not stable enough that a domain expert can go in and program the logic, but there's also not a huge set of nice, clean, regular data to train on; you have this smaller, messier data. Does this make sense to you? And where do you see the big problems being solved right now? Is it more on the big data side, the areas where you have a lot of data, and it might be complicated, but there's a lot of it so you can run machine learning or deep learning algorithms? Or do you also come across a lot of small data issues, where maybe a fault only happens once every six months, so it may be hard to test for those, or for whatever reason you have very limited data sets to work with?

Martin: Yeah, my take on this is that we software vendors are now talking so much about AI and ML; that's the hype right now. We believe we're pretty far away from a situation where there are generic ML and generic AI models you can use broadly in the industry. So it comes down to very customer-specific, very customized, use-case-specific models. That is easier to do when it's one machine and you can figure out that specific machine. But when you're looking at processes, it becomes very, very complicated.

You can find specific individual use cases, like quality inspection with vision cameras, for instance, processing that at the edge for specific alerts. That can be trained to detect specific scratches on a surface, or what have you. But when it becomes much broader, it's harder. And I think we're hyping this so much from the industry side that customers believe they need to start with the AI/ML side. But there's a lot of business value that can be generated just by starting with fixed algorithms, fixed conditions, fixed filtering rules, and creating triggers on individual data points from machines, without going through the whole ML models and AI algorithms.

I think the most important thing is to get started and to eat the elephant in pieces. Looking at sensor data, ML can definitely play an important role, but it could be an optimization: you could possibly extract 80% of your business value without it, and then the last 20% you can get with machine learning models. And I think a lot of companies hesitate to get started because they believe they need to have these predictive analytics models that [inaudible 43:35]. Well, there are other ways in between that create business value.

But it is definitely an area we have a lot of hope in, because it's going to be great when that can happen, meaning when ML and AI algorithms are more generally available. There are so many initiatives going on right now in the industry to simplify building those models; all the big cloud providers have projects and solutions for this. In a few years' time, fast forward five years, I believe it's going to be much, much easier to build those models.

Even if you have the model, the challenge remains: what are you going to do before the model, and what are you going to do after the model? So we have a strategy that we call bring your own AI: we don't help customers develop models. We say, you know your environment; you build it, or you find a partner. But there's also what we call edge MLOps, meaning how can you deploy ML algorithms out to the field at scale? If you deploy hundreds of thousands of units and then retrain the model and redeploy it, how do you do that cost-efficiently and securely? It's a super important and exciting area that I think is going to create value over time, but right now there's not too much deployed.
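
[Editor's illustration: a hedged sketch of the edge MLOps idea: each edge node checks a central registry for a newer model version, downloads it, and only switches over once the new model is fully written, so retraining and redeploying to many units stays a cheap, repeatable operation. The registry stub, file layout, and version scheme are hypothetical.]

```python
import json
from pathlib import Path

MODEL_DIR = Path("models")            # hypothetical local model store on the edge node

def registry_lookup():
    """Placeholder for asking a central model registry for the latest version."""
    return {"name": "vibration-anomaly", "version": 4, "blob": b"...model bytes..."}

def current_version():
    meta = MODEL_DIR / "current.json"
    return json.loads(meta.read_text())["version"] if meta.exists() else 0

def update_model():
    latest = registry_lookup()
    if latest["version"] <= current_version():
        return False                  # already up to date, nothing to redeploy
    MODEL_DIR.mkdir(parents=True, exist_ok=True)
    (MODEL_DIR / f"model-v{latest['version']}.bin").write_bytes(latest["blob"])
    # Only point "current" at the new version once the blob is fully written.
    (MODEL_DIR / "current.json").write_text(json.dumps({"version": latest["version"]}))
    return True

print("updated" if update_model() else "no change")
```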

Erik: And there's a lot of effort that goes on in a production facility that doesn't require super sophisticated algorithms; it just requires intelligent decision making, but decision making that people already know how to guide. But let's then, Martin, look at a specific case. Maybe you can walk us through it end to end: what problem was this company facing, and how did they deploy? And if you can put in some of the details, like the timeline, people can get a perspective on what it would actually look like if they were to deploy Crosser in their facility.

Martin: Absolutely. So I was actually approached by the CEO of a German machine builder in the warehouse automation space called Gebhart. They reached out to us because they wanted to gain a competitive advantage. They have a vision that by having a very advanced, modern, digital, connected solution, they can stand out in the crowd and offer a much better experience for their end customers.

So warehouse automation, those are the cool robots you see in pictures going up and down the aisles in warehouses, picking stuff. Shuttles, I believe they call them. They wanted real-time monitoring of those shuttles, and they figured out that by analyzing vibrations in real time, they could identify patterns and predict service or maintenance needs.

So there were a couple of objectives they had: a competitive advantage, number one, but also improved customer experience by avoiding unexpected breakdowns, and lower maintenance costs by having scheduled maintenance windows. It's like with your car: every year you need to take it to the garage, even if there's nothing wrong with it; you do the service every year because that's what the schedule says.

With machines, say you would have three service windows in a year; what if you could get away with only two? Maybe the machine is fine and two is enough; then you would not have to take that machine down for one service window, and that would increase uptime, and that's money. So there's a lot of end customer value as well as maintenance cost savings. And their challenge was: how do I process vibration data in real time? Because vibration data is high-frequency data, with a lot of data points coming out.

So they realized they wanted to run edge analytics, to have real-time processing locally and then only send processed, relevant data to their industrial IoT platform. They contacted us, and within a month they were up in a trial. They ran a one-month trial and then we signed an agreement. We got them trained online and then they were up and starting to build their use cases. So it was a couple of months from the first contact from their CEO until the project was live.
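
[Editor's illustration: a minimal sketch of what "process vibration locally, send only relevant data" can look like: compute one RMS feature per window of high-frequency samples and forward only the windows that deviate from a baseline. The sample rate, baseline, and tolerance are hypothetical, not Gebhart's actual parameters.]

```python
import math

def rms(window):
    return math.sqrt(sum(x * x for x in window) / len(window))

def vibration_features(samples, window_size=1000, baseline=1.0, tolerance=0.5):
    """Reduce raw high-frequency samples to one RMS feature per window and
    yield only the windows that look abnormal versus the baseline."""
    for start in range(0, len(samples) - window_size + 1, window_size):
        value = rms(samples[start:start + window_size])
        if abs(value - baseline) > tolerance:
            yield {"window_start": start, "rms": round(value, 3)}

# Hypothetical feed: mostly normal vibration, with one rough patch.
feed = [1.0] * 3000 + [2.5] * 1000 + [1.0] * 1000
for feature in vibration_features(feed):
    print("forward to IIoT platform:", feature)
```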

A typical challenge we run into, and this is across all users, is firewalls. They deploy inside the warehouses at the customer sites, so inside the customer's firewall, and there are always firewall challenges. Those are small tweaks; we need to ask the firewall administrator to change some settings. But we always run into that, no matter where we are; that's a challenge. They should have control over their firewall, of course. But we tend to address those challenges pretty quickly. So that's the project, and now they're deploying this to their new customers as they go.

Erik: So this is one of those revenue opportunity cases, in addition to the reduced maintenance costs. Do you know whether they modified their business model or pricing model around this at all? So, for example, how they might bill customers for maintenance or structure that aspect of the business? Or has it so far just been: we deliver better service to our customers and we reduce our operating costs?

Martin: For them, they made the decision to treat this as a competitive advantage. It's basically like cars: the whole car industry is now moving into self-driving cars, driven by Tesla, because Tesla put it out as a competitive feature; they wanted a better end customer experience. The same here: they want to be at the forefront with a digital offering. It's not just their automation platforms, which are great in themselves, but by making them digital and completely connected, they are positioning themselves. Their goal is to win more projects based on this. So that's their big ROI, which goes way beyond saving cloud costs or maintenance costs, etc.

Erik: So I'm curious, in this case, are they selling their core offering as CapEx, or are they selling it as an OpEx service where they deploy the equipment at their customer sites but then operate it as a service? Do you know what that model is?

Martin: I believe it's still a CapEx model. This whole outcome-based pricing, like Rolls-Royce pioneered with their engines decades ago, is still something many are talking about in the industry, but few have gone that route. So this is still a CapEx model.

Erik: I guess this takes them one step towards that, or at least it makes it much easier if they have the data and actually know what the real cost is.

Martin: I think it's going to be industry by industry. Some industries are going to see the advantage of that, others will not see advantages of that model. So it's going to be super interesting in the next 10 years in this market, for sure.

Erik: But could you give us a high-level understanding of what a reasonable budget would be if somebody wanted to start working with Crosser? Maybe the example you just gave is not the ideal one, because it's a little unique to that situation. But if we were to think of a medium-sized factory that wanted to deploy it on the shop floor to start managing shop floor data, what would be a rough starting point to have something operationally up and running?

Martin: As I mentioned before, we encourage our customers to start small and grow big, and we of course have a business model that is aligned with that. So the starting point is pretty low. The average starting point is somewhere between 10,000 and 40,000 euros per year, depending on the scope of the project. And then it depends on how much they roll out, how many use cases, etc. So it's a really, really small investment that they need to make.

Erik: Well, Martin, I really appreciate you taking the time. We're at the hour here. Are there any key questions we haven't covered, anything else you want to share about upcoming launches, new features, anything exciting coming out of Crosser?

Martin: I think, as I mentioned before, we have realized the power of low code and we want to push it even further. One of the things we're also moving into now, driven by customer requests, is customers telling us: okay, Crosser, we have machine data coming in, and then we have all these other data sources here.

But what if I took enterprise systems as the inputs, did all the automation and analytics in the middle, and then had another enterprise system on the other end; could I do that? Can I go from Salesforce cloud to on-premise SAP, with automation, logic, and workflows in the middle? Basically, what we call intelligent process automation: combining this to have real-time enabled integration. That's something we are now working to simplify even more for our customers. So that's going to be the next super exciting part for us: not only machine data to enterprise, but enterprise to enterprise as well.

Erik: Well, Martin, I wish you tons of success. And I have no doubt that you'll build Crosser into one of the great new industrial tech companies. Thanks for taking time to speak with us today, really appreciate it.

Martin: Thanks for inviting me here. It has been a pleasure.

Erik: Thanks for tuning in to another edition of the industrial IoT spotlight. Don't forget to follow us on Twitter at IotoneHQ, and to check out our database of case studies on IoTONE.com. If you have unique insight or a project deployment story to share, we'd love to feature you on a future edition. Write us at erik.walenza@IoTone.com.
