

Location: United Kingdom
Revenue: < $10m
Employees: < 10

Intellegens is a spin-out of the University of Cambridge founded to develop and commercialise novel Artificial Intelligence (AI) software. Intellegens has developed proprietary algorithms that allow neural networks to be trained on fragmented or incomplete data. It has already deployed its code successfully in two diverse applications, drug discovery and materials design, where it has significantly cut customers’ costs by reducing the number of experiments required, thereby shortening development cycles and accelerating time-to-market. The company was founded by Dr Gareth Conduit, a Royal Society Fellow at the Cavendish Laboratory, and Ben Pellegrini, an expert in big data and cloud-based platforms. The Intellegens approach can be applied to many other data domains; current opportunities include health, autonomous cars, and retail. To enable wider uptake of this approach, Intellegens is developing an online offering with additional funding from Innovate UK.
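Intellegens's actual algorithms are proprietary and not described here. As a minimal sketch of the general idea of training a model on incomplete data, one common technique is to mask missing target values out of the loss so that only observed entries contribute gradients. The data, model, and learning rate below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 100 samples, 3 features; ~40% of target values are missing.
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)
mask = rng.random(100) < 0.6          # True where the target is observed
y_obs = np.where(mask, y, np.nan)     # NaN marks missing targets

# Fit a linear model by gradient descent on a masked squared-error loss:
# missing rows contribute zero error, hence zero gradient.
w = np.zeros(3)
for _ in range(500):
    pred = X @ w
    err = np.where(mask, pred - y_obs, 0.0)
    grad = X.T @ err / mask.sum()
    w -= 0.1 * grad

# w now approximately recovers true_w despite the incomplete targets.
```

The same masking idea extends to neural networks and to multi-output settings where different targets are missing for different rows.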

Intellegens is a provider of Industrial IoT analytics and modeling technologies.
Analytics & Modeling
Machine Learning
Use Cases
Immersive Analytics
Predictive Maintenance
Data Science Services
Intellegens’s Technology Stack maps the company’s participation in the analytics and modeling IoT technology stack.
  • Application Layer
    Functional Applications
  • Cloud Layer
    Platform as a Service
    Infrastructure as a Service
  • Edge Layer
    Automation & Control
    Processors & Edge Intelligence
  • Devices Layer
    Robots
  • Supporting Technologies
    Analytics & Modeling
    Application Infrastructure & Middleware
    Cybersecurity & Privacy
    Networks & Connectivity
Technological Capability
Number of Case Studies: 2
Novel Deep Learning Approach for Predictive Maintenance and Process Optimization
Most organisations apply a “Reactive Maintenance” approach to their processes, in which equipment is repaired or replaced after a failure occurs. It costs around 10x more to repair a machine after it fails, not to mention the direct impact on revenue and customer satisfaction. Under “Preventative Maintenance”, equipment is repaired or replaced at pre-set time intervals in order to avoid failure. While this approach reduces unplanned downtime, it is expensive, because scheduled repairs often take place when nothing is wrong with the equipment. The benefits of predictive maintenance are therefore significant, and it is becoming the preferred method for manufacturers: it enables organisations to foresee failures and schedule repairs and replacements only when needed, aiming for 100% operational uptime of the equipment.

One challenge for traditional machine learning in manufacturing is that these techniques require clean and complete data, whereas manufacturing and process data can be sparse and noisy. Currently, it is difficult for engineers to access and interpret production process data, so they rely on personal experience and opinion to modify process parameters. This leads to inconsistent and potentially suboptimal decision making, and moreover increases the risk of process failure, with the associated time and costs. The production line is especially difficult to model using standard techniques due to the inherent time lag and inertia between changing operating parameters and observing their effect. Costs associated with waste material and failed production could also be significantly reduced by applying relevant and innovative deep learning technology to design production processes more efficiently.
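The economics above can be made concrete with a back-of-the-envelope comparison. All numbers below are hypothetical assumptions for illustration; only the ~10x failure-repair multiplier comes from the case study text:

```python
# Illustrative annual maintenance-cost comparison. Every figure here is an
# assumed example value, not data from the case study, except the 10x
# post-failure repair multiplier cited in the text.

PLANNED_REPAIR_COST = 1_000       # assumed cost of one scheduled repair
FAILURE_MULTIPLIER = 10           # repairing after failure costs ~10x (per text)
FAILURES_PER_YEAR = 5             # assumed failures if equipment runs to failure

# Reactive: repair only after each failure, at the 10x penalty.
reactive = FAILURES_PER_YEAR * FAILURE_MULTIPLIER * PLANNED_REPAIR_COST

# Preventative: fixed schedule; many repairs occur when nothing is wrong.
SCHEDULED_REPAIRS_PER_YEAR = 20   # assumed conservative schedule
preventative = SCHEDULED_REPAIRS_PER_YEAR * PLANNED_REPAIR_COST

# Predictive: repair only when a model forecasts imminent failure, so the
# number of repairs approaches the true number of failures.
predictive = FAILURES_PER_YEAR * PLANNED_REPAIR_COST

print(reactive, preventative, predictive)  # 50000 20000 5000
```

Under these assumed figures, the predictive strategy is the cheapest by a wide margin, which is the qualitative point the passage makes.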
Optimizing Tooling for Composite Drilling Using Deep Learning
Laminated fibre-reinforced polymer (FRP) matrix composites are increasingly used in industries such as aerospace due to their excellent mechanical properties and highly tailorable design. However, this tailorability can negatively impact costs, productivity, and sustainability during manufacture, especially in machining, where FRP part-specific defects occur. Process uncertainties that result in large, unpredictable defect generation are a common reason for prescribing overly conservative cutting-tool use limits based on part-quality criteria. Because of the wide array of tool designs and workpiece material configurations available, an application-specific approach is required to identify the most effective cutting strategies. Optimal cutting parameters can be found using an exhaustive, wide-boundary, DoE-based approach, but the slow and costly testing required to identify absolute tool-life limits makes this impractical. The challenge was therefore to establish a novel machine learning-based method to predict tool life from start-of-life performance data, reducing experimental time and cost. The project was particularly challenging because the original dataset was sparse, with 82% of the target data missing.
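The case study's actual method is proprietary, but the shape of the problem can be sketched: fit a predictor of tool life from start-of-life features when the vast majority of tool-life targets were never measured. The features, coefficients, and noise level below are invented placeholders; only the 82%-missing figure comes from the text:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical start-of-life features (e.g. early cutting forces, wear rates)
# and a tool-life target with 82% of values missing, as in the case study.
n = 200
X = rng.normal(size=(n, 4))
true_w = np.array([3.0, -1.0, 0.5, 2.0])
life = X @ true_w + 50 + rng.normal(scale=0.5, size=n)
observed = rng.random(n) >= 0.82        # ~18% of tool lives actually measured

# Least-squares fit on the observed rows only (with an intercept column).
Xo = np.column_stack([X[observed], np.ones(observed.sum())])
coef, *_ = np.linalg.lstsq(Xo, life[observed], rcond=None)

# Predict tool life for the ~82% of configurations never tested to failure.
Xm = np.column_stack([X[~observed], np.ones((~observed).sum())])
pred = Xm @ coef
```

In practice a method for sparse data would also exploit correlations between multiple partially observed targets rather than a single one, but the core idea of learning from the small observed subset and predicting the rest is the same.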