Big Tech eyes Industrial AI and Robotics
An overview of Big Tech’s inroads into manufacturing and industrial AI. From bin picking to robotic wire arc additive manufacturing (WAAM), the pace of industrial technology advances continues to pick up as digital transformation takes hold.
Assembly Line
🛢️🧠 ENEOS and PFN Begin Continuous Operation of AI-Based Autonomous Petrochemical Plant System
ENEOS Corporation (ENEOS) and Preferred Networks, Inc. (PFN) announced today that their artificial intelligence (AI) system, which they have been continuously operating since January 2023 for a butadiene extraction unit in ENEOS Kawasaki Refinery’s petrochemical plant, has achieved higher economy and efficiency than manual operations.
Jointly developed by ENEOS and PFN, the AI system is designed to automate large-scale, complex operations of oil refineries and petrochemical plants that currently require operators with years of experience. The new AI system is one of the world’s largest for petrochemical plant operation according to PFN’s research, with a total of 363 sensors for prediction and 13 controlled elements. The companies co-developed the system to improve safety and stability of plant operations by reducing dependence on technicians’ varying skill levels.
How generalized AI outperforms specialized models
IBM and NASA Open Source Largest Geospatial AI Foundation Model on Hugging Face
IBM (NYSE: IBM) and open-source AI platform Hugging Face today announced that IBM’s watsonx.ai geospatial foundation model – built from NASA’s satellite data – will now be openly available on Hugging Face. It will be the largest geospatial foundation model on Hugging Face and the first-ever open-source AI foundation model built in collaboration with NASA.
The model leverages IBM foundation model technology and is part of IBM’s larger effort to create and train AI models that can be used for different tasks and apply information from one situation to another. In June, IBM announced the availability of watsonx, an AI and data platform that allows enterprises to scale and accelerate the impact of the most advanced AI with trusted data. A commercial version of the geospatial model, which is part of IBM watsonx, will be available through the IBM Environmental Intelligence Suite (EIS) later this year.
Behind the A.I. tech making BMW vehicle assembly more efficient
Xaba and Lockheed Martin Collaborate to Test Cognitive Autonomous Robots in Airframe Manufacturing
Xaba, developers of the first AI-driven robotics and CNC machine controller, and Lockheed Martin recently completed a collaboration to evaluate the automation of crucial manufacturing operations using the global aerospace company’s industrial robots integrated with Xaba’s proprietary physics-informed deep artificial neural network model, xCognition.
Xaba and Lockheed Martin identified a use case focused on a typical robotics work cell found in any aerospace factory, testing how Xaba’s xCognition “synthetic brain” could give a commercial robot greater intelligence and understanding of its body and of the task it is about to execute, while ensuring required quality and tolerances are achieved.
Samsung to apply AI, big data tech to entire chipmaking process
In partnership with the Samsung Advanced Institute of Technology (SAIT), Samsung’s Device Solutions (DS) division, which oversees its semiconductor business, will lead the company’s efforts to broaden the use of AI throughout the chipmaking process, sources familiar with the matter said on Monday. Under the plan, Samsung will seek to apply AI tech to DRAM design automation, chip material development, foundry yield improvement, mass production and chip packaging.
Specifically, the company hopes its AI tech will determine the cause of unnecessary wafer losses, optimize the AI-based manufacturing process and analyze DRAM product defects, sources said.
How recovery facilities improve performance with AI residue line analysis
Residue lines hold a lot of potential for recovery facilities. That’s because they often contain more of your raw materials than you’d like: material that could have been turned into revenue. Assessing the residue stream is like a blood test for your plant: if there’s a lot of valuable material on the conveyor belt, your operations need a check-up.
The more material a facility recovers, the less it sends to landfills. If this plant recovered more valuable material, it wouldn’t just make money on its products: based on UK fees, it would save £56,000 a month by cutting unnecessary gate fees.
BMW Paint Shop with Artificial Intelligence: Automated Rework
In a World First, Yokogawa’s Autonomous Control AI Is Officially Adopted for Use at an ENEOS Materials Chemical Plant
ENEOS Materials Corporation (formerly the elastomers business unit of JSR Corporation) and Yokogawa Electric Corporation (TOKYO: 6841) announce they have reached an agreement that Factorial Kernel Dynamic Policy Programming (FKDPP), a reinforcement learning-based AI algorithm, will be officially adopted for use at an ENEOS Materials chemical plant. This agreement follows a successful field test in which this autonomous control AI demonstrated a high level of performance while controlling a distillation column at this plant for almost an entire year. This is the first example in the world of reinforcement learning AI being formally adopted for direct control of a plant.
Over a consecutive 35-day (840-hour) period, from January 17 to February 21, 2022, this field test initially confirmed that the AI solution could control distillation operations that were beyond the capabilities of existing control methods (PID control/APC) and had necessitated manual control of valves based on the judgements of experienced plant personnel. Following a scheduled plant shut-down for maintenance and repairs, the field test resumed and has continued to the present date. It has been conclusively shown that this solution is capable of controlling the complex conditions that are needed to maintain product quality and ensure that liquids in the distillation column remain at an appropriate level, while making maximum possible use of waste heat as a heat source. In so doing, it has stabilized quality, achieved high yield, and saved energy.
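Yokogawa has not published FKDPP’s internals, but the shape of the problem can be illustrated with a far simpler value-based reinforcement-learning loop. In this sketch the toy plant dynamics, reward, state discretization and hyperparameters are all assumptions for illustration; the real system learns to manipulate valves that previously required experienced operators.

```python
import random

ACTIONS = [-1.0, 0.0, 1.0]   # close valve a step, hold, open a step
TARGET_LEVEL = 50.0          # liquid level (%) that keeps product on-spec

def plant_step(level, action):
    """Toy column-level dynamics; the real process is far more complex."""
    return level + 0.8 * action + random.gauss(0, 0.5)

def bucket(level):
    return int(level)        # 1%-wide state buckets

q = {}                       # Q-values over (state bucket, action)
level, alpha, gamma, eps = 45.0, 0.1, 0.95, 0.1
for _ in range(10_000):
    s = bucket(level)
    a = (random.choice(ACTIONS) if random.random() < eps
         else max(ACTIONS, key=lambda act: q.get((s, act), 0.0)))
    level = plant_step(level, a)
    r = -abs(level - TARGET_LEVEL)              # penalize off-target level
    s2 = bucket(level)
    best_next = max(q.get((s2, act), 0.0) for act in ACTIONS)
    old = q.get((s, a), 0.0)
    q[(s, a)] = old + alpha * (r + gamma * best_next - old)  # Q-learning update
```

After training, the greedy policy holds the level near target; the appeal in a plant setting is that the same learning loop applies where no usable PID or APC model exists.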
Unlocking the Value Potential of Additive Manufacturing
Transitioning to AM requires not only a change in mindset but, more importantly, the ability to quickly and easily identify which parts are best suited for the additive manufacturing process. This is where AI and machine learning are now bridging the gap between traditional AM, where most of its value materializes in the form of functional prototypes, and more advanced additive manufacturing operations. “We have upwards of a million part numbers,” said Werner Stapela, head of global additive design and manufacturing at Danfoss, an international leader in drives, HVAC and power management systems. “So, it would be impossible for us to manually analyze each one to determine whether additive manufacturing would either add value or reduce costs.”
“We have been utilizing 3D printing for decades, mostly for prototyping, but the Castor3D software allows us to focus on our end components and more specifically the costs associated with that,” added Stapela. The software’s algorithm and machine learning can scan thousands of parts at once by analyzing CAD files, evaluating five factors (materials, CAD geometry, costs, lead time and strength testing) to identify parts suited to AM. The software can also make design for additive manufacturing (DfAM) suggestions regarding part consolidation and weight reduction opportunities.
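Castor3D’s screening logic is proprietary, but the five-factor triage described above can be sketched as a simple filter over per-part estimates. Every field and threshold below is a hypothetical placeholder; the real software derives geometry and strength results from the CAD files themselves.

```python
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    material_printable: bool    # is an AM-qualified material available?
    geometry_complexity: float  # 0..1 from CAD analysis (higher favors AM)
    machining_cost: float       # current cost per unit
    am_cost: float              # estimated AM cost per unit
    lead_time_weeks: float      # current lead time
    am_lead_time_weeks: float   # estimated AM lead time
    passes_strength_test: bool  # simulated load cases met when printed

def am_suitable(p: Part) -> bool:
    """Flag a part if AM is feasible and adds value on cost, time, or geometry."""
    return (p.material_printable
            and p.passes_strength_test
            and (p.am_cost < p.machining_cost                 # saves cost, or
                 or p.am_lead_time_weeks < p.lead_time_weeks  # saves time, or
                 or p.geometry_complexity > 0.7))             # geometry favors AM

parts = [Part("bracket-01", True, 0.9, 120.0, 95.0, 6, 1, True),
         Part("shaft-17", True, 0.1, 30.0, 140.0, 2, 2, False)]
print([p.name for p in parts if am_suitable(p)])  # -> ['bracket-01']
```

The point of automating this filter is scale: a rule set like this can triage a million part numbers long before any engineer opens a CAD file.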
Can Large Language Models Enhance Efficiency In Industrial Robotics?
One of the factors that slow down the penetration of industrial robots into manufacturing is the complexity of human-to-machine interfaces. This is where large language models, such as ChatGPT developed by OpenAI, come in. Large language models are a cutting-edge artificial intelligence technology that can understand and respond to human language, with output at times almost indistinguishable from human conversation. Their versatility has been proven in applications ranging from chatbots to language translation and even creative writing.
It turns out that large language models are quite effective at generating teach pendant programs for a variety of industrial robots, such as KUKA, FANUC, Yaskawa, ABB and others.
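As a rough sketch of that workflow: prompt an LLM with the task and target robot dialect, then review the generated routine before it ever reaches the pendant. `call_llm` is a hypothetical stand-in for whatever chat-completion API is used, and the KUKA KRL requirements in the prompt are illustrative.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical stand-in: wire this to your LLM provider of choice."""
    raise NotImplementedError

PROMPT = """You are an industrial robot programmer.
Write a KUKA KRL routine that:
  1. moves to home position,
  2. picks a part at P_pick with a vacuum gripper (output $OUT[1]),
  3. places it at P_place,
  4. returns home.
Use PTP motions, add approach/retreat offsets of 50 mm, and comment each step."""

krl_program = call_llm(PROMPT)
# Always review and dry-run generated programs in simulation
# before loading them onto the teach pendant.
print(krl_program)
```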
Enabling Certification of DL-Based Software Components
Artificial-intelligence software, particularly deep-learning (DL) components, is currently the most advanced and economically feasible solution for achieving autonomous systems, such as autonomous cars. However, the nature of DL algorithms and their current implementation are at odds with the stringent software development process followed in safety-critical systems like cars, satellites and trains.
SAFEXPLAIN, a project funded by the European Union, aims to bridge this gap to enable the certification of DL-based software components, including those that inherit high-integrity fail-operational safety requirements. SAFEXPLAIN considers three pillars simultaneously:
- DL-based software components
- Certification practice against functional safety standards
- Efficient execution on commercial platforms
The future is now: Unlocking the promise of AI in industrials
Many executives remain unsure where to apply AI solutions to capture real bottom-line impact. The result has been slow rates of adoption, with many companies taking a wait-and-see approach rather than diving in.
Rather than endlessly contemplate possible applications, executives should set an overall direction and road map and then narrow their focus to areas in which AI can solve specific business problems and create tangible value. As a first step, industrial leaders could gain a better understanding of AI technology and how it can be used to solve specific business problems. They will then be better positioned to begin experimenting with new applications.
Computing With Chemicals Makes Faster, Leaner AI
A device that draws inspiration from batteries now appears surprisingly well suited to run artificial neural networks. Called electrochemical RAM (ECRAM), it is giving traditional transistor-based AI an unexpected run for its money—and is quickly moving toward the head of the pack in the race to develop the perfect artificial synapse. Researchers recently reported a string of advances at this week’s IEEE International Electron Device Meeting (IEDM 2022) and elsewhere, including ECRAM devices that use less energy, hold memory longer, and take up less space.
A commercial ECRAM chip that accelerates AI training is still some distance away. The devices can now be made of foundry-friendly materials, but that’s only part of the story, says John Rozen, program director at the IBM Research AI Hardware Center. “A critical focus of the community should be to address integration issues to enable ECRAM devices to be coupled with front-end transistor logic monolithically on the same wafer, so that we can build demonstrators at scale and establish if it is indeed a viable technology.”
Smart Devices, Smart Manufacturing: Pegatron Taps AI, Digital Twins
Today, Pegatron uses Cambrian, an AI platform it built for automated inspection, deployed in most of its factories. It maintains hundreds of AI models, trained and running in production on NVIDIA GPUs. Pegatron’s system uses NVIDIA A100 Tensor Core GPUs to develop AI models up to 50x faster than the workstations it previously trained them on, cutting weeks of work down to a few hours. Pegatron uses NVIDIA Triton Inference Server, open-source software that helps deploy, run and scale AI models across all types of processors and frameworks.
Taking another step in smarter manufacturing, Pegatron is piloting NVIDIA Omniverse, a platform for developing digital twins. “In my opinion, the greatest impact will come from building a full virtual factory so we can try out things like new ways to route products through the plant,” he said. “When you just build it out without a simulation first, your mistakes are very costly.”
Using artificial intelligence to control digital manufacturing
MIT researchers have now used artificial intelligence to streamline this procedure. They developed a machine-learning system that uses computer vision to watch the manufacturing process and correct errors in how it handles the material in real time. They used simulations to teach a neural network how to adjust printing parameters to minimize error, then applied that controller to a real 3D printer. Their system printed objects more accurately than all the other 3D printing controllers they compared it to.
The work avoids the prohibitively expensive process of printing thousands or millions of real objects to train the neural network. And it could enable engineers to more easily incorporate novel materials into their prints, which could help them develop objects with special electrical or chemical properties. It could also help technicians make adjustments to the printing process on-the-fly if material or environmental conditions change unexpectedly.
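Setting the training details aside, the deployment pattern is a vision-in-the-loop controller: a network trained in simulation maps camera frames to parameter corrections. This sketch assumes a TorchScript export of such a controller plus hypothetical `camera` and `printer` interfaces; none of these names come from the paper.

```python
import torch

# Assumed export of the sim-trained controller network (placeholder path).
policy = torch.jit.load("sim_trained_controller.pt")
policy.eval()

def control_step(printer, camera):
    frame = camera.capture_frame()                 # H x W x 3 uint8 numpy array
    x = torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        d_speed, d_flow = policy(x)[0].tolist()    # corrections, not absolute values
    printer.adjust(speed_delta=d_speed, flow_delta=d_flow)  # apply on the fly
```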
Visual Anomaly Detection: Opportunities and Challenges
Clarifai is pleased to announce the pre-GA offering of a PatchCore-based visual anomaly detection model as part of our visual inspection solution package for manufacturing, which also includes various purpose-built visual detection and segmentation models, custom workflows and reference application templates.
Users need only a few hundred images of normal examples for training, plus roughly ten anomalous examples per category for calibration and testing only, particularly when the background is homogeneous and the region of interest is well focused.
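The PatchCore approach explains why so little data suffices: a memory bank of patch features is built from normal images only, and a test patch is scored by its distance to the nearest bank entry. Below is a minimal sketch in which crude pixel-patch features stand in for the mid-level CNN features and coreset subsampling of the real method.

```python
import torch

def patch_features(images: torch.Tensor, k: int = 8) -> torch.Tensor:
    """Crude stand-in for PatchCore's CNN features: flatten k x k patches.
    images: (N, C, H, W) float tensor -> (num_patches, C*k*k) features."""
    patches = images.unfold(2, k, k).unfold(3, k, k)   # (N, C, H/k, W/k, k, k)
    patches = patches.permute(0, 2, 3, 1, 4, 5)
    return patches.reshape(-1, images.shape[1] * k * k)

# Memory bank from normal images only (placeholder data here).
normal_images = torch.rand(16, 3, 64, 64)
bank = patch_features(normal_images)

def anomaly_score(test_image: torch.Tensor) -> torch.Tensor:
    feats = patch_features(test_image.unsqueeze(0))
    nn_dist = torch.cdist(feats, bank).min(dim=1).values  # nearest-neighbor distance
    return nn_dist.max()   # image-level score = worst patch

print(anomaly_score(torch.rand(3, 64, 64)))
```

The handful of anomalous calibration images mentioned above are then only needed to pick a score threshold, not to train the model.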
Simplifying the world of materials properties evaluation using AI
Mettler-Toledo, together with CSEM and ZHAW, has developed AIWizard: an artificial intelligence (AI) option for its STARe software that will make it easier to interpret DSC curves for thermal analysis.
Currently, manufacturers have high expectations surrounding the performance of their materials. A sealing ring must not become brittle, a PET bottle cannot deform, and medications need to react within the body at exactly the right time. Across the material science domain, Mettler-Toledo’s dynamic Differential Scanning Calorimeter (DSC) has become an indispensable tool for many. Thermal analysis makes a valuable contribution from quality control to research and development of materials and chemical compounds.
Yokogawa and DOCOMO Successfully Conduct Test of Remote Control Technology Using 5G, Cloud, and AI
Yokogawa Electric Corporation and NTT DOCOMO, INC. announced today that they have conducted a proof-of-concept test (PoC) of a remote control technology for industrial processing. The PoC test involved the use in a cloud environment of an autonomous control AI, the Factorial Kernel Dynamic Policy Programming (FKDPP) algorithm developed by Yokogawa and the Nara Institute of Science and Technology, and a fifth-generation (5G) mobile communications network provided by DOCOMO. The test, which successfully controlled a simulated plant processing operation, demonstrated that 5G is suitable for the remote control of actual plant processes.
Decentralized learning and intelligent automation: the key to zero-touch networks?
Decentralized learning and the multi-armed bandit agent… It may sound like the sci-fi version of an old western. But could this dynamic duo hold the key to efficient distributed machine learning – a crucial factor in the realization of zero-touch automated mobile networks? Let’s find out.
Next-generation autonomous mobile networks will be complex ecosystems made up of a massive number of decentralized and intelligent network devices and nodes – network elements that may be both producing and consuming data simultaneously. If we are to realize our goal of fully automated zero-touch networks, new approaches to training artificial intelligence (AI) models need to be developed to accommodate these complex and diverse ecosystems.
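The bandit half of that duo is simpler than it sounds. An epsilon-greedy agent like the sketch below can run on a single node, learning locally which of a few candidate configurations pays off; the three arms and their reward distributions here are invented for illustration.

```python
import random

arms = 3
counts = [0] * arms
values = [0.0] * arms              # running mean reward per arm
eps = 0.1                          # exploration rate

def observed_reward(arm):
    """Unknown to the agent; stands in for a measured KPI per configuration."""
    return random.gauss([0.2, 0.5, 0.4][arm], 0.1)

for _ in range(5_000):
    arm = (random.randrange(arms) if random.random() < eps   # explore
           else max(range(arms), key=lambda a: values[a]))   # exploit
    r = observed_reward(arm)
    counts[arm] += 1
    values[arm] += (r - values[arm]) / counts[arm]           # incremental mean

print(max(range(arms), key=lambda a: values[a]))             # settles on arm 1
```

Because each agent needs only its own reward feedback, this kind of learner distributes naturally across many network elements, which is exactly the appeal for zero-touch automation.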
What’s Cognitive Manufacturing? Why Should It Matter To You?
Industry’s complex ecosystem requires the integration of various data systems, and it is not just sensor data systems that need retrofitting. Because many systems are analogue, multiple interfaces exist across proprietary and automation systems such as DCS, SCADA, Historian, and PLC. With multiple protocols in play, this ecosystem can be simplified through customisation: bringing data from all the heterogeneous processes onto a big data platform, understanding the business processes and their gaps, and applying predictive and prescriptive analytics.
Ford Taps Non-IT Professionals to Broaden Its AI Expertise
Ford hopes that opening up AI development to a broader range of employees can significantly reduce the average time it takes to develop many applications, in some cases from months to weeks and even days.
Ford’s AI builders are working on an AI-optimization model that will help the company decide which vehicles should be shipped to which European countries so that car inventory is optimized to maximize sales, according to Ford. The model takes into account thousands of variables, including the carbon-dioxide emissions of each vehicle type, each country’s emission standards, the number of miles citizens in a particular country drive, as well as the adoption of electric vehicles and the size of vehicles preferred in each country. Ford said the number of variables being analyzed requires the use of AI, which is designed to handle large data sets.
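Ford has not published the model, but allocation problems of this shape are classically expressed as constrained optimization. The toy linear program below allocates two vehicle types across three countries to maximize margin under supply and demand caps; all numbers are invented, and the real model would add rows for emissions standards, EV adoption and the other variables named above. Requires SciPy.

```python
from scipy.optimize import linprog

# x[i*3 + j] = units of vehicle type i shipped to country j (2 types, 3 countries)
margin = [3.0, 2.5, 2.8,    # expected profit per unit, type 0
          2.0, 2.7, 2.2]    # type 1
c = [-m for m in margin]    # linprog minimizes, so negate to maximize profit

A_ub = [
    [1, 1, 1, 0, 0, 0],     # supply limit, type 0
    [0, 0, 0, 1, 1, 1],     # supply limit, type 1
    [1, 0, 0, 1, 0, 0],     # demand cap, country 0
    [0, 1, 0, 0, 1, 0],     # demand cap, country 1
    [0, 0, 1, 0, 0, 1],     # demand cap, country 2
]
b_ub = [100, 80, 70, 60, 90]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 6, method="highs")
print(res.x.reshape(2, 3))  # shipment plan
print(-res.fun)             # total expected margin
```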
AI in the Process Industry
When applying AI to difficult problems in plants, approaches differ depending on whether AI researchers can access useful information derived from similar problems. This article first discusses how to search for and identify useful research and literature. If well-established AI research is available, the next step is simply to choose an appropriate AI platform. If not, the most serious bottleneck for the problem-solving task arises: how to integrate plant domain knowledge and AI technology. This article presents a solution to the latter case, one that enables plant engineers to make full use of AI geared toward them rather than toward data scientists. AI-based control, a promising AI application that is expected to solve difficult problems in plants, is also discussed.
Real-World ML with Coral: Manufacturing
For over 3 years, Coral has been focused on enabling privacy-preserving Edge ML with low-power, high performance products. We’ve released many examples and projects designed to help you quickly accelerate ML for your specific needs. One of the most common requests we get after exploring the Coral models and projects is: How do we move to production?
- Worker Safety - Performs generic person detection (powered by COCO-trained SSDLite MobileDet) and then runs a simple algorithm to detect bounding box collisions and determine whether a person is in an unsafe region (see the sketch after this list).
- Visual Inspection - Performs apple detection (using the same COCO-trained SSDLite MobileDet from Worker Safety) and then crops the frame to the detected apple and runs a retrained MobileNetV2 that classifies fresh vs rotten apples.
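As referenced above, the Worker Safety check reduces to rectangle intersection once the detector returns person boxes. A minimal version, with the box format and the example unsafe region as assumptions:

```python
def boxes_overlap(a, b):
    """Boxes as (x1, y1, x2, y2) pixels; True if the rectangles intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

UNSAFE_REGION = (400, 120, 640, 480)   # e.g., the area around a press

def unsafe_people(person_boxes):
    """Return the detected person boxes that enter the unsafe region."""
    return [box for box in person_boxes if boxes_overlap(box, UNSAFE_REGION)]

print(unsafe_people([(50, 100, 150, 300), (420, 200, 500, 460)]))
# -> [(420, 200, 500, 460)]
```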
Simplify Deep Learning Systems with Optimized Machine Vision Lighting
Deep learning cannot compensate for or replace quality lighting. This experiment’s results would hold true over a wide variety of machine vision applications. Poor lighting configurations will result in poor feature extraction and increased defect detection confusion (false positives).
Several rigorous studies show that classification accuracy declines with image-quality distortions such as blur and noise. In general, while deep neural networks perform better than or on par with humans on quality images, a network’s performance is much lower than a human’s on distorted images. Lighting improves input data, which greatly increases the ability of deep neural network systems to compare and classify images for machine vision applications. Smart lighting — geometry, pattern, wavelength, filters, and more — will continue to drive and produce the best results for machine vision applications with traditional or deep learning systems.
Scientists Set to Use Social Media AI Technology to Optimize Parts for 3D Printing
“My idea was that a material’s structure is no different than a 3D image,” he explains. “It makes sense that the 3D version of this neural network will do a good job of recognizing the structure’s properties — just like a neural network learns that an image is a cat or something else.”
To see if his idea would work, Messner defined a 3D geometry and used conventional physics-based simulations to create a set of two million data points. Each data point linked his geometry to ‘desired’ values of density and stiffness. He then fed the data points into a neural network and trained it to look for the desired properties.
Finally, Messner used a genetic algorithm – an iterative, optimization-based class of AI – together with the trained neural network to determine the structure that would result in the properties he sought. Impressively, his AI approach found the correct structure 2,760x faster than the conventional physics simulation.
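The speedup comes from where the fitness function points: each candidate structure is scored by the trained network in microseconds rather than by a physics run. A compact sketch, with a toy surrogate and design encoding standing in for Messner’s:

```python
import random

def surrogate(design):
    """Stand-in for the trained network mapping a design to a predicted
    property score (higher is better); here, closeness of the mean to 0.6."""
    return -abs(sum(design) / len(design) - 0.6)

def crossover(a, b):
    cut = random.randrange(1, len(a))      # one-point recombination
    return a[:cut] + b[cut:]

def mutate(d, rate=0.1):
    return [min(1.0, max(0.0, g + random.gauss(0, 0.1)))
            if random.random() < rate else g for g in d]

pop = [[random.random() for _ in range(16)] for _ in range(50)]
for _ in range(100):
    pop.sort(key=surrogate, reverse=True)
    parents = pop[:10]                                   # selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(40)]                 # new generation

print(surrogate(max(pop, key=surrogate)))  # approaches 0 as the GA converges
```

Swapping a physics solver for a learned surrogate inside this loop is what turns hours of simulation per candidate into a search over thousands of candidates.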
If AI Is So Awesome, Why Aren’t You Using It?
With all these universal applications and clearly understood benefits, the writing appears to be on the wall: AI is the wave of the future, and if you are not using or planning on using AI soon, you will be history! Software, platforms, and technologies are already out there, yet adoption appears to be slow. Financial justification and benefits analysis seem to be no-brainers, yet no one is out rushing to make improvements. Why is that?
Trash to Cash: Recyclers Tap Startup with World’s Largest Recycling Network to Freshen Up Business Prospects
People worldwide produce 2 billion tons of waste a year, with 37 percent going to landfill, according to the World Bank.
“Sorting by hand on conveyor belts is dirty and dangerous, and the whole place smells like rotting food. People in the recycling industry told me that robots were absolutely needed,” said Horowitz, the company’s CEO.
His startup, AMP Robotics, can double sorting output and increase purity for bales of materials. It can also sort municipal waste, electronic waste, and construction and demolition materials.
Tilling AI: Startup Digs into Autonomous Electric Tractors for Organics
Ztractor offers tractors that can be configured to work on 135 different types of crops. They rely on the NVIDIA Jetson edge AI platform for computer vision tasks to help farms improve plant conditions, increase crop yields and achieve higher efficiency.
Toward Generalized Sim-to-Real Transfer for Robot Learning
A limitation for their use in sim-to-real transfer, however, is that because GANs translate images at the pixel-level, multi-pixel features or structures that are necessary for robot task learning may be arbitrarily modified or even removed.
To address the above limitation, and in collaboration with the Everyday Robot Project at X, we introduce two works, RL-CycleGAN and RetinaGAN, that train GANs with robot-specific consistencies — so that they do not arbitrarily modify visual features that are specifically necessary for robot task learning — and thus bridge the visual discrepancy between sim and real.
The realities of developing embedded neural networks
With any embedded software destined for deployment in volume production, an enormous amount of effort goes into the code once the implementation of its core functionality has been completed and verified. This optimization phase is all about minimizing memory, CPU and other resources needed so that as much as possible of the software functionality is preserved, while the resources needed to execute it are reduced to the absolute minimum possible.
This process of creating embedded software from lab-based algorithms enables production engineers to cost-engineer software functionality into a mass-production ready form, requiring far cheaper, less capable chips and hardware than the massive compute datacenter used to develop it. However, it usually requires the functionality to be frozen from the beginning, with code modifications only done to improve the way the algorithms themselves are executed. For most software, that is fine: indeed, it enables a rigorous verification methodology to be used to ensure the embedding process retains all the functionality needed.
However, when embedding NN-based AI algorithms, that can be a major problem. Why? Because by freezing the functionality from the beginning, you are removing one of the main ways in which the execution can be optimized.
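One concrete instance of the optimization being described is post-training quantization, which shrinks and speeds up a trained network for cheaper silicon but changes its numerical behavior, so accuracy must be re-verified. A sketch with TensorFlow Lite (the article names no toolchain; the model path is a placeholder):

```python
import tensorflow as tf

# Convert a trained model with dynamic-range weight quantization.
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
# Re-measure accuracy after conversion: quantization is exactly the kind of
# functional change a frozen-spec embedded workflow struggles to absorb.
```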
AI Vision for Monitoring Applications in Manufacturing and Industrial Environments
In traditional industrial and manufacturing environments, monitoring worker safety, enhancing operator efficiency, and improving quality assurance were physical tasks. Today, AI-enabled machine vision technologies replace many of these inefficient, labor-intensive operations for greater reliability, safety, and efficiency. This article explores how, by deploying AI smart cameras, further performance improvements are possible since the data used to empower AI machine vision comes from the camera itself.
FPGA comes back into its own as edge computing and AI catch fire
The niche of edge computing burdens devices with the need for extremely low-power operation, tight form factors, agility in the face of changing data sets, and the ability to evolve with changing AI capabilities via remote upgradeability — all at a reasonable price point. This is, in fact, the natural domain of the FPGA, with its inherent excellence in accelerating compute-intensive tasks on a flexible, hardware-customizable platform. However, many of the available off-the-shelf FPGAs are geared toward data center applications in which power and cost profiles justify the bloat in FPGA technologies.
Tools Move up the Value Chain to Take the Mystery Out of Vision AI
Intel DevCloud for the Edge and Edge Impulse offer cloud-based platforms that take away most of the pain points with easy access to the latest tools and software, while Xilinx and others have started offering complete systems-on-module with production-ready applications that can be deployed with tools at a higher level of abstraction, removing the need for some of the more specialist skills.
How the USPS Is Finding Lost Packages More Quickly Using AI Technology from Nvidia
In one of its latest technology innovations, the USPS got AI help from Nvidia to fix a problem that has long confounded existing processes – how to better track packages that get lost within the USPS system so they can be found in hours instead of in several days. In the past, it took eight to 10 people several days to locate and recover lost packages within USPS facilities. Now it is done by one or two people in a couple hours using AI.
Influence estimation for generative adversarial networks
Expanding applications [1, 2] of generative adversarial networks (GANs) make improving the generative performance of models increasingly crucial. An effective approach to improving machine learning models is to identify training instances that “harm” the model’s performance. Recent studies [3, 4] replaced traditional manual screening of a dataset with “influence estimation,” which evaluates the harmfulness of a training instance by how the performance is expected to change when the instance is removed from the dataset. An example of a harmful instance is a wrongly labeled one (e.g., a “dog” image labeled as a “cat”). Influence estimation judges this “cat-labeled dog image” as harmful when its removal is predicted to improve performance (Figure 1).
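Conceptually, the brute-force version of that idea is leave-one-out retraining; the cited methods approximate the same quantity without the retraining cost. A schematic sketch, with `fit` and `loss` as placeholders:

```python
def influence(train_set, idx, val_set, fit, loss):
    """Change in validation loss from dropping one training instance.
    fit: dataset -> trained model; loss: (model, dataset) -> float.
    Both are placeholders for whatever training pipeline is in use."""
    base = loss(fit(train_set), val_set)
    ablated = train_set[:idx] + train_set[idx + 1:]
    return base - loss(fit(ablated), val_set)

# An instance is judged harmful when removing it is predicted to improve
# performance, i.e. influence(...) > 0 (like the "cat labeled dog image").
```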
John Deere and Audi Apply Intel’s AI Technology
Identifying defects in welds is a common quality control process in manufacturing. To make these inspections more accurate, John Deere is applying computer vision, coupled with Intel’s AI technology, to automatically spot common defects in the automated welding process used in its manufacturing facilities.
At Audi, automated welding applications range from spot welding to riveting. The widespread automation in Audi factories is part of the company’s goal of creating Industrie 4.0-level smart factories. A key aspect of this goal involves Audi’s recognition that creating customized hardware and software to handle individual use cases is not preferable. Instead, the company focuses on developing scalable and flexible platforms that allow it to more broadly apply advanced digital capabilities such as data analytics, machine learning, and edge computing.
Robotic Flexibility: How Today’s Autonomous Systems Can Be Adapted to Support Changing Operational Needs
While robots are ideally suited to repetitive tasks, until now they lacked the intelligence to identify and handle tens of thousands of constantly changing products in a typical dynamic warehouse operation. That made applying robots to picking applications somewhat limited. Therefore, when German electrical supply wholesaler Obeta sought to install a new automated storage system from MHI member KNAPP in its new Berlin warehouse as a means to address a regional labor shortage made worse by COVID-19, the company specified a robotic picking system powered by onboard artificial intelligence (AI).
“The Covariant Brain is a universal AI that allows robots to see, reason and act in the world around them, completing tasks too complex and varied for traditional programmed robots. Covariant’s software enables Obeta’s Pick-It-Easy Robot to adapt to new tasks on its own through trial and error, so it can handle almost any object,” explained Peter Chen, co-founder and CEO of MHI member Covariant.ai.
Ford's Ever-Smarter Robots Are Speeding Up the Assembly Line
At a Ford Transmission Plant in Livonia, Michigan, the station where robots help assemble torque converters now includes a system that uses AI to learn from previous attempts how to wiggle the pieces into place most efficiently. Inside a large safety cage, robot arms wheel around grasping circular pieces of metal, each about the diameter of a dinner plate, from a conveyor and slot them together.
The technology allows this part of the assembly line to run 15 percent faster, a significant improvement in automotive manufacturing where thin profit margins depend heavily on manufacturing efficiencies.
Machine learning optimizes real-time inspection of instant noodle packaging
During the production process there are various factors that can lead to a seasoning sachet slipping between two noodle blocks and being cut open by the cutting machine, or being packed separately in two packets side by side. Because such defective products would result in consumer complaints and damage to the company’s reputation, their delivery to dealers should be reduced as far as possible. Since the machine type upgraded by Tianjin FengYu already ran at a very low error rate, another aspect of quality control is critical: it must be ensured that only the defective products, and not the defect-free ones, are reliably sorted out.
Multi-Task Robotic Reinforcement Learning at Scale
For general-purpose robots to be most useful, they would need to be able to perform a range of tasks, such as cleaning, maintenance and delivery. But training even a single task (e.g., grasping) using offline reinforcement learning (RL), a trial-and-error learning method in which the agent learns from previously collected data, can take thousands of robot-hours, in addition to the significant engineering needed to enable autonomous operation of a large-scale robotic system. Thus, the computational costs of building general-purpose everyday robots using current robot learning methods become prohibitive as the number of tasks grows.
Intelligent edge management: why AI and ML are key players
What will the future of network edge management look like? We explain how artificial intelligence and machine learning technologies are crucial for intelligent edge computing and the management of future-proof networks. What’s required, and what are the building blocks needed to make it happen?
Intel Accelerates AI for Industrial Applications
The human eye can correct for different lighting conditions easily. Images collected by a camera, however, naturally vary in intensity and contrast as background lighting varies. We’ve seen factories run into scale challenges when deploying AI for defect detection: the exact same hardware, software and algorithm deployed on different machines on the factory floor can behave very differently. Sometimes it took months for factory managers and data scientists to find out why they were getting great results on one machine, with high accuracy and low false-positive and false-negative rates, while on the next machine over the AI application would crash.
Tractor Maker John Deere Using AI on Assembly Lines to Discover and Fix Hidden Defective Welds
John Deere performs gas metal arc welding at 52 factories where its machines are built around the world, and it has proven difficult to find defects in automated welds using manual inspections, according to the company.
That’s where the successful pilot program between Intel and John Deere has been making a difference, using AI and computer vision from Intel to “see” welding issues and get things back on track to keep John Deere’s pilot assembly line humming along.
Amazon’s robot arms break ground in safety, technology
Robin, one of the most complex stationary robot arm systems Amazon has ever built, brings many core technologies to new levels and acts as a glimpse into the possibilities of combining vision, package manipulation and machine learning, said Will Harris, principal product manager of the Robin program.
Those technologies can be seen when Robin goes to work. As soft mailers and boxes move down the conveyor line, Robin must break the jumble down into individual items. This is called image segmentation. People do it automatically, but for a long time, robots only saw a solid blob of pixels.
AI In Inspection, Metrology, And Test
“The human eye can see things that no amount of machine learning can,” said Subodh Kulkarni, CEO of CyberOptics. “That’s where some of the sophistication is starting to happen now. Our current systems use a primitive kind of AI technology. Once you look at the image, you can see a problem. And our AI machine doesn’t see that. But then you go to the deep learning kind of algorithms, where you have very serious Ph.D.-level people programming one algorithm for a week, and they can detect all those things. But it takes them a week to program those things, which today is not practical.”
That’s beginning to change. “We’re seeing faster deep-learning algorithms that can be more easily programmed,” Kulkarni said. “But the defects also are getting harder to catch by a machine, so there is still a gap. The biggest bang for the buck is not going to come from improving cameras or projectors or any of the equipment that we use to generate optical images. It’s going to be interpreting optical images.”
Harvesting AI: Startup’s Weed Recognition for Herbicides Grows Yield for Farmers
In 2016, the former dorm-mates at École Nationale Supérieure d’Arts et Métiers, in Paris, founded Bilberry. The company today develops weed recognition powered by the NVIDIA Jetson edge AI platform for precision application of herbicides at corn and wheat farms, offering as much as a 92 percent reduction in herbicide usage.
Driven by advances in AI and pressures on farmers to reduce their use of herbicides, weed recognition is starting to see its day in the sun.
AI tool locates and classifies defects in wind turbine blades
Using image enhancement, augmentation methods and the Mask R-CNN deep learning algorithm, the system analyses images, highlights defect areas and labels them.
After developing the system, the researchers tested it by inputting 223 new images. The proposed tool is said to have achieved around 85 per cent test accuracy for the task of recognising and classifying wind turbine blade defects.
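For readers wanting to reproduce the general pipeline, torchvision ships a Mask R-CNN whose inference loop looks like the sketch below. The COCO-pretrained weights are a stand-in (the researchers’ blade-defect classes and weights are not public), the score threshold is arbitrary, and the `weights=` argument assumes torchvision 0.13 or newer.

```python
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.io import read_image

model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()  # COCO stand-in weights

img = read_image("blade.jpg").float() / 255.0            # (3, H, W) in [0, 1]
with torch.no_grad():
    out = model([img])[0]        # dict with boxes, labels, scores, masks

keep = out["scores"] > 0.5       # arbitrary confidence threshold
for box, mask in zip(out["boxes"][keep], out["masks"][keep]):
    print(box.tolist())          # defect bounding box; mask is a (1, H, W) map
```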
Adversarial training reduces safety of neural networks in robots
A more fundamental problem, also confirmed by Lechner and his coauthors, is the lack of causality in machine learning systems. As long as neural networks focus on learning superficial statistical patterns in data, they will remain vulnerable to different forms of adversarial attacks. Learning causal representations might be the key to protecting neural networks against adversarial attacks. But learning causal representations itself is a major challenge and scientists are still trying to figure out how to solve it.
Using tactile-based reinforcement learning for insertion tasks
A paper entitled “Tactile-RL for Insertion: Generalization to Objects of Unknown Geometry” was submitted by MERL and MIT researchers to the IEEE International Conference on Robotics and Automation (ICRA). In it, reinforcement learning was used to enable a robot arm, equipped with a parallel-jaw gripper carrying tactile sensing arrays on both fingers, to insert differently shaped novel objects into a corresponding hole with an overall average success rate of 85% within 3-4 tries.
Improving advanced manufacturing practices through AI's Bayesian network
With experience, we learn awareness of events and conditions in our plant environment. As our experience matures, we learn the possibility of a given set of events and conditions resulting in certain outcomes. Computational models can perform the same service by capturing events and conditions, then calculating the probability of certain consequences. If the probability of an anticipated outcome is unacceptable, our computers can inform us of a condition needing attention or address the situation themselves. This, along with collecting meaningful volumes of relevant data, is the core of AI.
One mathematical model employed in AI is the Bayesian network (BN), a graph that defines the relationships between conditions or events and their possible consequences. The conditions or events are random variables, identified on a BN as nodes.
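The mechanics are easy to show by hand on a three-node chain. In this invented example, an overheating event raises the probability of seal wear, which in turn raises the probability of a leak; summing over the unobserved variables gives the marginal probability that triggers an alert.

```python
# Chain BN: Overheat -> SealWear -> Leak (all probabilities invented).
P_overheat = 0.05
P_wear_given = {True: 0.7, False: 0.1}    # P(SealWear | Overheat)
P_leak_given = {True: 0.4, False: 0.02}   # P(Leak | SealWear)

# Marginalize over the two parent variables: sum the probability of
# every chain of conditions that ends in a leak.
p_leak = sum(
    (P_overheat if o else 1 - P_overheat)
    * (P_wear_given[o] if w else 1 - P_wear_given[o])
    * P_leak_given[w]
    for o in (True, False)
    for w in (True, False)
)
print(round(p_leak, 4))   # ~0.0694; alert a technician above a threshold
```

Conditioning works the same way: fixing Overheat to True in the sum yields P(Leak | Overheat), which is how a BN turns observed plant conditions into updated risk.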
Using AI to Find Essential Battery Materials
KoBold’s AI-driven approach begins with its data platform, which stores all available forms of information about a particular area, including soil samples, satellite-based hyperspectral imaging, and century-old handwritten drilling reports. The company then applies machine learning methods to make predictions about the location of compositional anomalies—that is, unusually high concentrations of ore bodies in the Earth’s subsurface.
Evolutionary Algorithms: How Natural Selection Beats Human Design
An evolutionary algorithm, which is a subset of evolutionary computation, can be defined as a “population-based metaheuristic optimization algorithm.” These nature-inspired algorithms evolve populations of experimental solutions through numerous generations by using the basic principles of evolutionary biology such as reproduction, mutation, recombination, and selection.
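A (1+1) evolution strategy is about the smallest runnable instance of that definition: a population of one parent and one mutated offspring, with selection keeping the better and the mutation step adapting to success. The toy objective below is a simple sphere function.

```python
import random

def sphere(x):                        # toy objective: minimum at the origin
    return sum(v * v for v in x)

x = [random.uniform(-5, 5) for _ in range(10)]
step = 1.0
for _ in range(2_000):
    child = [v + random.gauss(0, step) for v in x]   # mutation
    if sphere(child) <= sphere(x):                   # selection
        x, step = child, step * 1.1                  # success: widen the search
    else:
        step *= 0.98                                 # failure: narrow the search

print(round(sphere(x), 6))            # approaches 0 over the generations
```

Larger populations add the recombination step; the selection-variation loop itself stays the same.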
Introducing Amazon SageMaker Reinforcement Learning Components for open-source Kubeflow pipelines
Woodside Energy uses AWS RoboMaker with Amazon SageMaker Kubeflow operators to train, tune, and deploy reinforcement learning agents to their robots to perform manipulation tasks that are repetitive or dangerous.
Leveraging AI and Statistical Methods to Improve Flame Spray Pyrolysis
Flame spray pyrolysis has long been used to make small particles that can be used as paint pigments. Now, researchers at Argonne National Laboratory are refining the process to make smaller, nano-sized particles of various materials that can make nano-powders for low-cobalt battery cathodes, solid state electrolytes and platinum/titanium dioxide catalysts for turning biomass into fuel.
Way beyond AlphaZero: Berkeley and Google work shows robotics may be the deepest machine learning of all
With no well-specified rewards and state transitions that take place in a myriad of ways, training a robot via reinforcement learning represents perhaps the most complex arena for machine learning.
Evolution of control systems with artificial intelligence
Control systems have continuously evolved over decades, and artificial intelligence (AI) technologies are helping advance the next generation of some control systems.
The proportional-integral-derivative (PID) controller can be interpreted as a layering of capabilities: the proportional term points toward the signal, the integral term homes in on the setpoint and the derivative term can minimize overshoot.
Although the controls ecosystem may present a complex web of interrelated technologies, it can also be simplified by viewing it as ever-evolving branches of a family tree. Each control system technology offers its own characteristics not available in prior technologies. For example, feed forward improves PID control by predicting controller output, and then uses the predictions to separate disturbance errors from noise occurrences. Model predictive control (MPC) adds further capabilities to this by layering predictions of future control action results and controlling multiple correlated inputs and outputs. The latest evolution of control strategies is the adoption of AI technologies to develop industrial controls.
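Those layers are compact enough to show in code. A minimal discrete PID loop on a toy plant follows; the gains, time step and plant response are illustrative, and a feed-forward term would simply add a contribution predicted from a measured disturbance alongside the PID terms.

```python
def pid_step(error, state, kp=1.0, ki=0.2, kd=0.1, dt=0.1):
    integral, prev_error = state
    integral += error * dt                   # integral term homes in on the setpoint
    derivative = (error - prev_error) / dt   # derivative term damps overshoot
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

setpoint, pv, state = 100.0, 20.0, (0.0, 0.0)
for _ in range(200):
    u, state = pid_step(setpoint - pv, state)
    pv += 0.05 * u                           # toy plant: output integrates the input
print(round(pv, 2))                          # settles near the setpoint
```

MPC replaces this single-loop reaction with predictions over a horizon and multiple coupled inputs and outputs; the AI-based strategies in the articles above push that evolution one branch further.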
Rearranging the Visual World
Transporter Nets use a novel approach to 3D spatial understanding that avoids reliance on object-centric representations, making them general for vision-based manipulation but far more sample efficient than benchmarked end-to-end alternatives. As a consequence, they are fast and practical to train on real robots. We are also releasing an accompanying open-source implementation of Transporter Nets together with Ravens, our new simulated benchmark suite of ten vision-based manipulation tasks.
Artificial Intelligence: Driving Digital Innovation and Industry 4.0
Intelligent AI solutions can analyze high volumes of data generated by a factory to identify trends and patterns which can then be used to make manufacturing processes more efficient and reduce their energy consumption. Employing Digital Twin-enabled representations of a product and the associated process, AI is able to recognize whether the workpiece being manufactured meets quality requirements. This is how plants are constantly adapting to new circumstances and undergoing optimization with no need for operator input. New technologies are emerging in this application area, such as Reinforcement Learning – a topic that has not been deployed on a broad scale up to now. It can be used to automatically ascertain correlations between production parameters, product quality and process performance by learning through ‘trial-and-error’ – and thereby dynamically tuning the parameter values to optimize the overall process.
Edge-Inference Architectures Proliferate
What makes one AI system better than another depends on a lot of different factors, including some that aren’t entirely clear.
The new offerings exhibit a wide range of structure, technology, and optimization goals. All must be gentle on power, but some target wired devices while others target battery-powered devices, giving different power/performance targets. While no single architecture is expected to solve every problem, the industry is in a phase of proliferation, not consolidation. It will be a while before the dust settles on the preferred architectures.
Pushing The Frontiers Of Manufacturing AI At Seagate
Big data, analytics and AI are widely used in industries like financial services and e-commerce, but are less likely to be found in manufacturing companies. With some exceptions like predictive maintenance, few manufacturing firms have marshaled the amounts of data and analytical talent to aggressively apply analytics and AI to key processes.
Seagate Technology, an over $10B manufacturer of data storage and management solutions, is a prominent counter-example to this trend. It has massive amounts of sensor data in its factories and has been using it extensively over the last five years to ensure and improve the quality and efficiency of its manufacturing processes.
Stanford researchers propose AI that figures out how to use real-world objects
One longstanding goal of AI research is to allow robots to meaningfully interact with real-world environments. In a recent paper, researchers at Stanford and Facebook took a step toward this by extracting information related to actions like pushing or pulling objects with movable parts and using it to train an AI model. For example, given a drawer, their model can predict that applying a pulling force on the handle would open the drawer.
Advanced Technologies Adoption and Use by U.S. Firms: Evidence from the Annual Business Survey
While robots are usually singled out as a key technology in studies of automation, the overall diffusion of robotics use and testing is very low across firms in the U.S. The use rate is only 1.3% and the testing rate is 0.3%. These levels correspond relatively closely with patterns found in the robotics expenditure question in the 2018 ASM. Robots are primarily concentrated in large, manufacturing firms. The distribution of robots among firms is highly skewed, and the skewness in favor of larger firms can have a disproportionate effect on the economy that is otherwise not obvious from the relatively low overall diffusion rate of robots. The least-used technologies are RFID (1.1%), Augmented Reality (0.8%), and Automated Vehicles (0.8%). Looking at the pairwise adoption of these technologies in Table 14, we find that use of Machine Learning and Machine Vision are most coincident. We find that use of Automated Guided Vehicles is closely associated with use of Augmented Reality, RFID, and Machine Vision.
The Amazing Ways The Ford Motor Company Uses Artificial Intelligence And Machine Learning
The Ford research lab has conducted research on computational intelligence for more than 20 years. About 15 years ago the company introduced an innovative misfire detection system—one of the first large-scale industrial applications of neural networks. Ford uses artificial intelligence to automate quality assurance as well; AI can detect wrinkles in car seats. In addition, neural networks help support Ford’s supply chain through inventory and resource management.