
Revolutionizing Engineering: AI’s New Role in Solving Complex Equations Faster Than Supercomputers



Supercomputers are known for their high performance, which allows them to solve complex computational problems. The fastest computers in the world, these machines can process massive datasets and perform intricate calculations at rapid speed, with the leading systems capable of performing as many as one quintillion calculations per second.

Interestingly, just this week, the tech giant Google unveiled its next-gen quantum chip, ‘Willow,’ which operates using superconducting qubits. It completed a benchmark computation in under five minutes that would take today’s fastest supercomputers an impractically long time, while reducing errors exponentially as more qubits are added.

Despite its impressive performance, the quantum chip is nowhere close to breaking modern cryptography.

Amid all this, a new artificial intelligence (AI) framework has emerged with the capability to solve complex engineering problems even faster than supercomputers. The new solution comes from Johns Hopkins researchers and could be a game-changer in the engineering space.

The New Era of AI 

After being a hot topic for years, AI has finally started to be utilized meaningfully across key industries. Its immense potential to enhance efficiency and productivity has pushed its market beyond $184 billion this year, and the technology is projected to contribute more than $15 trillion to the global economy by the end of this decade.

A recent report found that 68% of organizations are either actively utilizing Gen AI or have developed roadmaps following successful pilot implementations. 

As AI continues to transform various industries, especially the engineering landscape, professionals now face the risk of their skills becoming outdated. Estimates suggest that over the next decade, up to 40% of engineering tasks may be automated.

To understand AI’s impact on the world, we must first understand that AI is simply a technology that enables machines and computers to simulate human thinking, learning, comprehension, problem-solving, decision-making, and creativity. 

Underpinning AI is machine learning, which involves training algorithms on data to create models that make decisions and predictions.

There are different kinds of machine learning algorithms or techniques, with artificial neural networks being one of the most popular types. These networks are modeled after the structure and function of the human brain. 

Deep learning, a subset of machine learning, uses multilayered neural networks that are even more effective at simulating the complex decision-making power of the human brain. These networks learn from data and are used to solve different problems, ranging from image and speech recognition to processing natural language. 

Deep learning is changing the way machines handle complex data, in some tasks achieving accuracy that matches or surpasses human-level performance.
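To make the idea of a multilayered neural network concrete, here is a minimal sketch in Python using PyTorch. The library choice, toy data, and tiny architecture are assumptions made purely for illustration and are not tied to any system mentioned in this article.

```python
import torch
import torch.nn as nn

# A small multilayered (deep) feedforward network: each hidden layer
# learns a progressively more abstract representation of the input.
model = nn.Sequential(
    nn.Linear(10, 64),   # input layer: 10 features
    nn.ReLU(),
    nn.Linear(64, 64),   # hidden layer
    nn.ReLU(),
    nn.Linear(64, 1),    # output layer: a single prediction
)

# Toy data: 100 samples of 10 features each, with noisy targets.
x = torch.randn(100, 10)
y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(100, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# The standard training loop: predict, measure error, adjust weights.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()      # backpropagation computes gradients
    optimizer.step()     # gradient descent updates the weights

print(f"final training loss: {loss.item():.4f}")
```

Real deep-learning systems differ mainly in scale, with more layers, specialized architectures, and far larger datasets, but the predict-measure-update loop is the same.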

Using AI to Address Complex Problems 

AI offers numerous benefits, such as automation of repetitive tasks, fewer human errors, round-the-clock availability, and enhanced decision-making, which has led to its application in all kinds of businesses across industries.

The technology’s ability to efficiently analyze vast amounts of data, identify patterns that may have been missed by humans, and finally execute quick calculations makes AI a great tool for solving complex problems. When dealing with large datasets and intricate decision-making scenarios that would be time-consuming or impossible for humans to address alone, AI can be immensely helpful.

Hence, there is a growing focus on utilizing AI to solve intricate problems. A year ago, researchers from MIT and ETH Zurich used machine learning to speed up the optimization problem of efficiently routing holiday packages for companies like FedEx.

These companies utilize software called a mixed-integer linear programming (MILP) solver, which splits the problem into smaller pieces and uses generic algorithms to find the best solution, a process that can take hours or even days.

Here, the key step that slows the entire process is that MILP solvers must sift through an enormous number of potential options. The researchers used a filtering mechanism to simplify this step, which sped up MILP solvers by 30 to 70 percent without affecting accuracy. The technique relied on the principle of diminishing marginal returns to shrink the number of options, then used machine learning, trained on a dataset specific to the problem at hand, to choose the best option from the reduced set.
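As a rough illustration of that pruning idea, here is a toy Python sketch, not the MIT/ETH Zurich code: a tiny routing-style set-covering problem is solved with the open-source PuLP library after a cheap scoring step trims the pool of candidate routes. In the actual research the filtering happens inside the solver, over candidate separators, and the scores come from a trained model; the scoring function and data below are invented purely for demonstration.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, PULP_CBC_CMD

# Toy data: candidate delivery routes with a cost and the packages covered.
routes = {
    "r1": {"cost": 10, "covers": {1, 2}},
    "r2": {"cost": 14, "covers": {2, 3}},
    "r3": {"cost": 6,  "covers": {3}},
    "r4": {"cost": 25, "covers": {1, 2, 3}},
    "r5": {"cost": 9,  "covers": {1}},
}
packages = {1, 2, 3}

# Stand-in for the learned filter: score routes by cost per package covered
# and keep only the top candidates (diminishing returns: a few good options
# are usually enough). A trained model would produce these scores instead.
def score(r):
    return routes[r]["cost"] / len(routes[r]["covers"])

kept = sorted(routes, key=score)[:4]   # prune the option space before solving

# Set-covering MILP over the reduced set: pick routes so every package
# is covered at least once, at minimum total cost.
prob = LpProblem("routing", LpMinimize)
use = {r: LpVariable(f"use_{r}", cat="Binary") for r in kept}
prob += lpSum(routes[r]["cost"] * use[r] for r in kept)
for p in packages:
    prob += lpSum(use[r] for r in kept if p in routes[r]["covers"]) >= 1

prob.solve(PULP_CBC_CMD(msg=False))
print("chosen routes:", [r for r in kept if use[r].value() == 1])
```

The point of the sketch is only the ordering of the steps: prune a large option space cheaply, then hand the smaller problem to an exact solver.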

Just earlier this month, a London-based startup, PhysicsX, introduced a large geometry model called LGM-Aero for aerospace engineering. The geometry and physics model is expected to help bring notable reductions in aircraft concept development time. The company has made a reference application (‘Ai.rplane’) built on LGM-Aero publicly accessible to showcase the capabilities of its model in generating aircraft designs and predicting physics related to aircraft performance. 

The model is trained on Amazon Web Services (AWS) cloud compute using over 25 million different shapes, representing over 10 billion vertices. Its training data also includes a collection of computational fluid dynamics (CFD) and finite element analysis (FEA) simulations generated in collaboration with Siemens. 

Much like how LLMs understand text, the LGM has a vast knowledge of the shapes and structures important to aerospace engineering and, as such, “can optimize across multiple types of physics in seconds, many orders of magnitude faster than numerical simulation, and at the same level of accuracy,” said CEO Jacomo Corbo.

This year, OpenAI, the Microsoft-backed AI research company behind ChatGPT, also unveiled its latest models, the o1-preview and o1-mini, claiming a significant leap in the reasoning capabilities of large language models (LLMs). 

The model comes with the ability to use “chain-of-thought reasoning,” similar to what humans do when solving a problem: dividing a complex task into small, manageable steps. The application of human-like reasoning in LLMs has previously been explored by Google Research and others as well.
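The snippet below sketches what a chain-of-thought style prompt can look like. It makes no API call; the example problem and the step-by-step wording are invented here purely to show how a task can be decomposed into explicit intermediate steps for a language model.

```python
# An illustrative chain-of-thought style prompt (no API call; the wording
# and the example problem are made up solely to show the decomposition idea).
prompt = (
    "A depot has 3 trucks; each truck carries 40 packages per trip and "
    "makes 5 trips per day. How many packages can the depot deliver in 2 days?\n"
    "Let's reason step by step:\n"
    "1. Packages one truck delivers per day = 40 * 5\n"
    "2. Packages all trucks deliver per day = result of step 1 * 3\n"
    "3. Packages delivered in 2 days = result of step 2 * 2\n"
    "State the final number."
)
print(prompt)
```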

A New AI Model to Solve PDEs

As the usage and popularity of AI continue to grow, so do its capabilities, with researchers and companies working on making the technology better and more accurate. 

The latest AI framework from Johns Hopkins researchers takes a generic approach to predicting solutions to a time-consuming and prevalent class of math equations: partial differential equations (PDEs), which are pervasive in engineering and medical research.

However, the computational costs involved in solving these equations can be prohibitively high. Also, solving these massive math problems generally requires supercomputers, but not anymore. 

The new AI framework enables even personal computers to tackle these partial differential equations that scientists use to translate real-world processes or systems into mathematical representations of how objects change over time and space.
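For readers unfamiliar with PDEs, here is a minimal NumPy sketch of a textbook example, the one-dimensional heat equation u_t = alpha * u_xx, solved the traditional way: discretize the domain into a grid and step the solution forward in time. This is generic illustration code, not the researchers' method, and the parameters are chosen only to keep the scheme stable.

```python
import numpy as np

# 1D heat equation u_t = alpha * u_xx on a rod of length 1,
# solved with an explicit finite-difference scheme on a fixed grid.
alpha = 0.01               # thermal diffusivity
nx, nt = 101, 5000         # grid points in space, time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / alpha   # time step chosen to satisfy the stability limit

u = np.zeros(nx)
u[nx // 2] = 1.0           # initial condition: a heat spike at the rod's center

for _ in range(nt):
    # second spatial derivative approximated from neighboring grid points
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0     # boundary condition: rod ends held at zero

print("temperature at the rod's center after diffusion:", round(u[nx // 2], 4))
```

The expensive part in real engineering problems is exactly this grid: complex 3D shapes require fine meshes, and every change of shape means rebuilding the mesh and redoing the computation.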

This isn’t the first time an AI model has been proposed to solve PDEs; in fact, the idea was first put forward decades ago. In the emerging field of scientific machine learning, solving partial differential equations with neural networks has gained considerable attention over the past decade, thanks to advances in the computational power available for training deep neural networks.

Despite the success of the neural operator, which uses AI to learn the PDE solution operator, the latest research noted that computational bottlenecks persist in tasks such as optimization and prediction. This is because existing neural operators cannot evaluate PDE solutions that depend on the geometry of the domain.

Currently, as the study noted, the majority of neural operator frameworks are developed on a domain with fixed boundaries, and any variation in shape requires retraining the neural network.

So, with the aim of addressing these computational challenges, the researchers proposed DIMON — Diffeomorphic Mapping Operator Learning. For this, they combined neural operators with diffeomorphic mappings between domains and shapes. 

The model eliminates the need to recalculate grids with every change of form. Instead of breaking complex shapes down into small elements, DIMON speeds up simulations and design optimization by predicting how physical quantities such as motion, stress, and heat behave across different shapes.

Generally, solving these equations involves breaking complex shapes, such as human organs or airplane wings, into grids or meshes made of small elements. The problem is then solved on each simple piece before it is recombined. 

However, if these shapes change due to a crash or deformation, the grids must be updated. This means the solutions need to be recalculated, too, which makes the entire computation process not only expensive but also slow. 

DIMON instead employs AI to learn how physical systems behave across various shapes. Researchers no longer have to divide each shape into a grid and solve the equations over and over; the AI uses the patterns it has learned to predict how the relevant quantities will behave, making it faster and more efficient to model shape-specific scenarios and optimize designs.
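The researchers' actual architecture is not reproduced here, but the following toy Python sketch conveys the general surrogate idea behind this kind of approach: sample solutions for many shapes on one shared reference grid, train a small network to map a reference coordinate plus a shape parameter to the solution value, and then evaluate new shapes with a cheap forward pass instead of a fresh mesh-based simulation. The one-parameter "shape," the closed-form stand-in for the solver, and the network are all invented for illustration.

```python
import numpy as np
import torch
import torch.nn as nn

# Illustrative stand-in for a numerical solver: the "solution" of a field on a
# shape described by a single stretch parameter s, evaluated at reference
# coordinate x in [0, 1]. In a real pipeline this would be an expensive
# mesh-based PDE solve; here it is a closed-form function purely for demo.
def reference_solution(x, s):
    return np.sin(np.pi * x) * np.exp(-s * x)

# Training data: many shapes (values of s), each sampled on the SAME reference
# grid, mimicking the idea of mapping every shape back to one common domain.
xs = np.linspace(0.0, 1.0, 64)
shapes = np.random.uniform(0.5, 2.0, size=200)
X = np.array([[x, s] for s in shapes for x in xs], dtype=np.float32)
Y = np.array([[reference_solution(x, s)] for s in shapes for x in xs],
             dtype=np.float32)

# Small network mapping (reference coordinate, shape parameter) -> field value.
model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                      nn.Linear(64, 64), nn.Tanh(),
                      nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

X_t, Y_t = torch.from_numpy(X), torch.from_numpy(Y)
for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X_t), Y_t)
    loss.backward()
    opt.step()

# Predict the field on an unseen shape with a cheap forward pass,
# with no re-meshing and no new simulation.
new_shape = 1.3
query = torch.tensor([[0.5, new_shape]], dtype=torch.float32)
print("predicted vs. true at x=0.5:",
      model(query).item(), reference_solution(0.5, new_shape))
```

In the real framework the shapes are full 3D geometries mapped diffeomorphically onto a reference domain and the training data comes from actual PDE solves, but the payoff is the same: once trained, predictions take seconds on ordinary hardware.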

According to co-lead Natalia Trayanova, a biomedical engineering and medicine professor at Johns Hopkins University:

“While the motivation to develop it came from our own work, this is a solution that we think will generally have a massive impact on various fields of engineering because it’s very generic and scalable.”

A Tipping Point for Engineering Designs

The new AI framework provides an approach that allows for fast prediction of PDE solutions on multiple domains. Moreover, it facilitates many downstream applications using AI. 

Talking about the model’s capabilities, Trayanova noted that DIMON can work on basically any problem, in any field of science or engineering, that requires solving PDEs on multiple geometries.

This includes crash testing, analyzing how spacecraft respond to extreme environments, assessing how bridges resist stress, studying how fluids propagate through different geometries, doing orthopedics research, and addressing other complex problems where materials and shapes change. The modeling of all these scenarios can now be made a lot faster thanks to the new AI framework.

To demonstrate the applicability of the new model in solving other kinds of engineering problems, the team tested DIMON on more than 1,000 heart “digital twins.” These digital twins are highly detailed computer models of the hearts of real patients.

Cardiac arrhythmia, a condition in which misbehaving electrical impulses cause the heart to beat irregularly, is studied by solving partial differential equations. Digital twins of hearts enable researchers to determine whether patients might develop this often-fatal condition and then recommend ways to treat it.

The new AI framework successfully predicted, with high accuracy, how electrical signals propagate through each unique heart shape, without needing to perform expensive numerical simulations.

Trayanova, the director of the Johns Hopkins Alliance for Cardiovascular Diagnostic and Treatment Innovation, applies data-driven approaches, computational modeling, and innovations in cardiac imaging to diagnose and treat cardiovascular disease. Her team is constantly introducing novel technology into the clinic.

However, she noted that their solutions are still too slow: it takes about a week to scan a patient’s heart, solve the PDEs to predict whether the patient is at high risk of sudden cardiac death, and then provide the best treatment plan.

But this is seeing a monumental shift with their latest model.

“With this new AI approach, the speed at which we can have a solution is unbelievable.”

– Trayanova

The time it takes to compute a prediction on a heart digital twin has been reduced from several hours to just 30 seconds. What’s more, the calculation doesn’t even require a supercomputer; it all runs on a desktop computer, which Trayanova said would allow them “to make it part of the daily clinical workflow.”

The versatility of the technology makes it perfect for situations where solving partial differential equations on new shapes is repeatedly needed. 

“For each problem, DIMON first solves the partial differential equations on a single shape and then maps the solution to multiple new shapes. This shape-shifting ability highlights its tremendous versatility. We are very excited to put it to work on many problems as well as to provide it to the broader community to accelerate their engineering design solutions.”

– Minglang Yin, a postdoc fellow at Johns Hopkins Biomedical Engineering, who developed the platform

Companies Advancing AI

Now, let’s take a look at companies that are helping take the technological revolution of AI to new heights. 

1. NVIDIA Corporation (NVDA -2.25%)

A leading provider of GPUs, Nvidia is the world’s second-largest company, with a market cap of $3.28 trillion. At the time of writing, its shares are trading at $133.91, up a whopping 171.9% year-to-date (YTD). The company has an EPS (TTM) of 2.54, a P/E (TTM) of 52.90, and an ROE (TTM) of 127.21%, and it pays a dividend yield of 0.03%.


The company’s hardware and software solutions are crucial for deep learning applications and engineering simulations, playing an important role in advancing the AI revolution.

Driven by AI mania, Nvidia reported a revenue of over $35 billion for Q3 ended October 27, 2024, which is an increase of 17% from the previous quarter and a massive 94% increase from a year ago.

“The age of AI is in full steam, propelling a global shift to NVIDIA computing,” said CEO and founder Jensen Huang, who further noted that AI is not only transforming companies and industries but also countries that are “awakened to the importance of developing their national AI and infrastructure.”

2. Microsoft Corporation (MSFT -0.51%)  

A market cap of $3.32 trillion puts Microsoft among the world’s top three companies. Its shares, as of writing, are trading at $447.24, an increase of almost 19% YTD. The company’s EPS (TTM) stands at 12.11, its P/E (TTM) at 36.92, and its ROE (TTM) at 35.60%, while its dividend yield is 0.74%.


Microsoft’s biggest involvement in AI is through OpenAI, into which it has poured more than $13 billion. Beyond its partnership with OpenAI, whose valuation recently hit $150 billion, Microsoft is also heavily invested in AI research, cloud solutions, and applications for engineering and scientific computing.

For the period between July and September, the company reported $65.6 billion in sales, an increase of 16% from a year earlier, while its profits rose 11% to $24.7 billion. This growth was propelled by demand continuing “to be higher than our available capacity,” as per Microsoft’s finance chief.

3. ANSYS, Inc. (ANSS -0.39%)

Ansys specializes in engineering simulation software for solving complex problems. The company, whose products are used by students, researchers, designers, and engineers, is also increasingly integrating AI to enhance efficiency.

With a market cap of $29.75 billion, Ansys shares are currently trading at $339.51, down 6.24% this year. The company has an EPS (TTM) of 6.47, a P/E (TTM) of 52.55, and an ROE (TTM) of 10.48%.


For 3Q24, Ansys reported $601.9 million in revenue, up 31% from the third quarter last year, while its annual contract value (ACV) was $540.5 million. The GAAP operating profit margin was reported to be 26.8%, while the non-GAAP operating profit margin was 45.8%. Operating cash flows for the period came in at $174.2 million, while its deferred revenue and backlog were $1,463.8 million.

Conclusion 

AI is advancing at a rapid pace, and the introduction of new AI frameworks like DIMON marks a revolutionary step in solving complex engineering problems while drastically reducing the time and computational costs associated with them. This way, the breakthrough not only accelerates engineering design processes but also expands AI’s application across diverse fields.

As researchers, along with companies like Nvidia, continue to make big discoveries, build powerful models, and advance technologies, the potential for integrating AI into everyday workflows grows exponentially, which points to a new era where AI drives unprecedented efficiency and innovation!
