AI Applications In Engineering

Explore top LinkedIn content from expert professionals.

  • Marily Nika, Ph.D

    Gen AI Product @ Google | AI builder & Educator | Get certified as an AI PM with my Bootcamp | O’Reilly Best Selling Author | Fortune 40u40 | aiproduct.com

    121,140 followers

    Introducing RICE-A, a prioritization framework for AI products.

    Traditional frameworks like RICE excel at helping teams evaluate feature ideas based on Reach, Impact, Confidence, and Effort. However, AI products pose unique challenges around data collection, model training, and deployment that require a more nuanced approach. I sometimes see Product Managers folding these challenges into ‘Effort’, but I don’t believe that is the right approach... That’s why I am proposing RICE-A, an enhanced prioritization framework tailored specifically for AI-driven features. RICE-A helps product managers make data-informed decisions, balancing innovation with execution feasibility.

    ✨ What Is RICE-A?
    RICE-A builds on the RICE framework by introducing a fifth factor: AI Complexity (A). This additional layer captures the unique effort required by the AI lifecycle - to design, train, and deploy AI models - ensuring AI-specific challenges are weighted appropriately.

    ✨ The RICE-A Formula (see the image)
    Each component evaluates a specific aspect of the feature's feasibility and potential:
    → Reach: What percentage of your target audience will benefit from this feature?
    → Impact: How significant is the impact for the target user?
    → Confidence: How certain are you about the accuracy of your assumptions and your ability to deliver?
    → Effort: What is the engineering effort needed to implement the feature?
    → (the new part) AI Complexity (A): What are the data and computational demands for collecting the right dataset, training a robust model, and ensuring scalability?

    ✨ Why Add "AI Complexity"?
    AI features present unique challenges that aren't captured by traditional effort metrics. For example:
    - Data Challenges: Collecting, cleaning, and labeling high-quality datasets is often a monumental task.
    - Training Costs: Model training requires substantial computational resources, hyperparameter tuning, and infrastructure setup.
    - Deployment & Monitoring: AI systems demand post-deployment monitoring, retraining, and bias detection to ensure sustained performance.

    I expand on this in the first link in the comments, where I've also included 11 AI Product Management jobs I'd apply to if I were looking, for anyone interested.

    <><><><><><><><><><><><><><><><>
    Follow Marily Nika, Ph.D for the #1 AI Product Management certification. The best way to support my work is to like & share 🔄 my content.
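The post points to an image for the exact formula, so as an illustration only: the sketch below assumes RICE-A follows the common reading of RICE - (Reach × Impact × Confidence) / Effort - with AI Complexity added as a second divisor. The function name and example numbers are hypothetical.

```python
def rice_a_score(reach, impact, confidence, effort, ai_complexity):
    """Score a feature idea with a RICE-A-style formula.

    Assumption: RICE-A extends the classic RICE score
    (reach * impact * confidence / effort) by also dividing by
    AI Complexity. The post's image defines the exact formula.
    """
    return (reach * impact * confidence) / (effort * ai_complexity)


# Two hypothetical features: a simple rules-based tweak vs. an ML-heavy one.
simple_feature = rice_a_score(reach=80, impact=2, confidence=0.9,
                              effort=3, ai_complexity=1)
ml_feature = rice_a_score(reach=60, impact=3, confidence=0.7,
                          effort=5, ai_complexity=4)
print(round(simple_feature, 1))  # 48.0
print(round(ml_feature, 1))      # 6.3
```

Note how the AI-heavy feature's score drops sharply once its data and training burden is scored separately instead of being buried inside Effort, which is exactly the point of the extra factor.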

  • Aishwarya Srinivasan
    599,136 followers

    If you’re an AI engineer, or aspiring to be one, and looking to build strong, technical portfolio projects, this one’s for you. I’ve pulled together 5 practical project ideas that go beyond toy examples and give you real exposure to open-weight models, multimodal inputs, long-context reasoning, tool use via MCP, and even on-device AI. Each project includes an open-source reference repo, so you don’t have to start from scratch - you can fork, build, and iterate.

    1️⃣ Autonomous Browser Agent
    Turn any website into an API. Give the agent a natural-language goal; it plans, acts in a real browser, and returns structured output.
    → Model: DeepSeekV3 via Fireworks AI Inference
    → Planner: LangGraph
    → Browser control: Playwright MCP server or browser-use
    → Optional memory: mem0
    🔗 Repo: shubcodes/fireworksai-browseruse

    2️⃣ 1M-token Codebase Analyst
    Load massive repos like PyTorch into a single 1M-token window and answer deep questions about architecture and logic - no brittle chunking.
    → Model: Llama 4 Maverick served via Fireworks AI (KV-cache paging)
    → Long-context tuning: EasyContext
    → Interface: Gradio or VS Code extension
    🔗 Repos: adobe-research/NoLiMa, jzhang38/EasyContext

    3️⃣ Multimodal Video-QA & Summariser
    Ingest long-form videos and output timeline-aligned summaries and Q&A. Combine visual frames with ASR transcripts for deep comprehension.
    → Model: MVU (ICLR ’25) or HunyuanVideo
    → Retrieval: LanceDB hybrid search
    → Serving: vLLM multimodal backend + FFmpeg
    🔗 Repo: kahnchana/mvu

    4️⃣ Alignment Lab (RLHF / DPO)
    Fine-tune a 7B open-weight model using preference data and evaluate its behavior with real alignment benchmarks.
    → Framework: OpenRLHF with Fireworks AI endpoints
    → Evaluation: RewardBench, trlX
    → Dataset: GPT-4o-generated preference pairs
    🔗 Repo: OpenRLHF/OpenRLHF

    5️⃣ Local-first Voice Assistant with Memory
    Build a privacy-first voice assistant that runs fully offline, remembers users, and syncs memory when online.
    → Model: Mobile-optimized Llama 3.2 with ExecuTorch or Ollama
    → ASR and TTS: Whisper.cpp + WhisperSpeech
    → Memory: mem0 via OpenMemory MCP
    🔗 Repo: mem0ai/mem0

    My two cents:
    → Don’t wait for the “perfect” starting point. Fork one of these repos, add a feature, refactor the flow, swap the model. That’s how you learn.
    → If you’re stuck starting from scratch, lean on these foundations and build iteratively.
    → You don’t need to be perfect at coding everything - you can pair up with tools like Cursor, or use coding copilots like Claude or GitHub Copilot to break through blockers.
    → Prefer working on visible, end-to-end workflows. Even better if you can ship a demo or write a detailed blog post about what you learned.
    → If you’re not ready to build a full product, even contributing to an existing open-source agent or LLM inference repo is a great start.

    〰️〰️〰️
    Follow me (Aishwarya Srinivasan) for more AI insight, and subscribe to my Substack for more in-depth blogs and weekly updates in AI: https://lnkd.in/dpBNr6Jg
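To make project 1 concrete, here is a toy skeleton of the plan → act → observe loop behind an autonomous browser agent. The LLM planner and Playwright are replaced by hypothetical stub functions (`plan_next_action`, `execute`) so the control flow is runnable on its own; the referenced repo wires in the real components.

```python
def plan_next_action(goal, history):
    """Stand-in for the LLM planner: choose the next browser action.

    A real agent would prompt a model with the goal and the
    observation history; this stub hard-codes a two-step plan.
    """
    if not history:
        return {"action": "navigate", "url": "https://example.com"}
    if history[-1]["action"] == "navigate":
        return {"action": "extract", "selector": "h1"}
    return {"action": "done"}


def execute(action):
    """Stand-in for Playwright: pretend to act in a real browser."""
    if action["action"] == "navigate":
        return {"status": "loaded", **action}
    if action["action"] == "extract":
        return {"status": "ok", "text": "Example Domain", **action}
    return {"status": "finished"}


def run_agent(goal, max_steps=5):
    """Loop: plan, act, observe, until the planner says 'done'."""
    history = []
    for _ in range(max_steps):
        action = plan_next_action(goal, history)
        if action["action"] == "done":
            break
        history.append(execute(action))
    return {"goal": goal, "steps": history}  # structured output


result = run_agent("Get the page title of example.com")
print(len(result["steps"]))  # 2
```

The `max_steps` cap is the important design choice: it bounds a misbehaving planner, which every real browser agent needs.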

  • Guido De Croon
    7,077 followers

    Today, Science Robotics has published our work on the first drone performing fully #neuromorphic vision and control for autonomous flight! 🥳

    Deep neural networks have led to amazing progress in Artificial Intelligence and promise to be a game-changer for autonomous robots 🤖 as well. A major challenge is that the computing hardware for running deep neural networks can still be quite heavy and power-hungry. This is particularly problematic for small robots like lightweight drones, for which most deep nets are currently out of reach.

    A new type of neuromorphic hardware draws inspiration from the efficiency of animal eyes 👁 and brains 🧠. Neuromorphic cameras do not record images at a fixed frame rate; instead, each pixel tracks brightness over time and sends a signal only when the brightness changes. These signals can then be sent to a neuromorphic processor, in which the neurons communicate with each other via binary spikes, simplifying calculations. The resulting asynchronous, sparse sensing and processing promises to be both quick and energy efficient! 🔋

    In our article, we investigated how a spiking neural network (#SNN) can be trained and deployed on a neuromorphic processor for perceiving and controlling drone flight 🚁. Specifically, we split the network in two. First, we trained an SNN to transform the signals from a downward-looking neuromorphic camera into estimates of the drone’s own motion. This network was trained with self-supervised learning on data coming from our drone itself. Second, we used artificial evolution 🦠🐒🚶♂️ to train another SNN for controlling a simulated drone. This network transformed the simulated drone’s motion into motor commands, such as adjustments to the drone’s orientation. We then merged the two SNNs 👩🏻🤝👩🏻 and deployed the resulting network on Intel Labs’ neuromorphic research chip "Loihi". The merged network worked on the real drone immediately, successfully bridging the reality gap.

    Moreover, the results highlight the promise of neuromorphic sensing and processing: the network ran 10-64x faster 🏎💨 than a comparable network on a traditional embedded GPU and used 3x less energy.

    I first want to congratulate all co-authors at TU Delft | Aerospace Engineering: Federico Paredes Vallés, Jesse Hagenaars, Julien Dupeyroux, Stein Stroobants, and Yingfu Xu 🎉 I would also like to thank Intel Labs' Neuromorphic Computing Lab and the Intel Neuromorphic Research Community (#INRC) for their support with Loihi (among others, Mike Davies and Yulia Sandamirskaya). Finally, I would like to thank NWO (Dutch Research Council), the Air Force Office of Scientific Research (AFOSR), and Office of Naval Research Global (ONR Global) for funding this project. All relevant links can be found below.

    Delft University of Technology, Science Magazine

    #neuromorphic #spiking #SNN #spikingneuralnetworks #drones #AI #robotics #robot #opticalflow #control #realitygap
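To make the spiking idea concrete, here is a minimal leaky integrate-and-fire (LIF) neuron, the basic unit of spiking networks like the SNNs described above. All parameters are illustrative, not taken from the paper: the neuron accumulates input, leaks over time, and emits a binary spike only when a threshold is crossed.

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over a spike train.

    Returns a binary spike (0/1) per time step. Parameters are
    illustrative, not from the published network.
    """
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = leak * potential + x   # leaky integration
        if potential >= threshold:
            spikes.append(1)               # binary spike
            potential = 0.0                # reset after firing
        else:
            spikes.append(0)
    return spikes


# Sparse input -> sparse output: the neuron only "speaks" when enough
# input has accumulated, which is what makes neuromorphic processing
# asynchronous and energy efficient.
print(lif_neuron([0.5, 0.5, 0.5, 0.0, 0.0, 1.2]))  # [0, 0, 1, 0, 0, 1]
```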

  • Luiza Jarovsky, PhD

    Co-founder of the AI, Tech & Privacy Academy (1,300+ participants), Author of Luiza’s Newsletter (87,000+ subscribers), Mother of 3

    121,405 followers

    🚨 Most people don't know it, but the EU aims to become a LEADER IN AI - not just in AI regulation. At the heart of its strategy are AI FACTORIES. Can it catch up with the U.S. and China? Big announcement today:

    First, what are AI Factories? According to the EU Commission:

    "AI Factories will bring together the key ingredients that are needed for success in AI: computing power, data, and talent. They will provide access to the massive computing power that start-ups, industry and researchers need to develop their AI models and systems. For example, European large language models or specialised vertical AI models focusing on specific sectors or domains."

    Today, the EU announced a second wave of AI Factories. The six new AI Factories will join the seven existing ones launched in December (see the map below with the countries involved). Here's the announcement:

    "Austria, Bulgaria, France, Germany, Poland, and Slovenia will host the newly selected AI Factories, supported by a combined national and EU investment of around €485 million. The factories will offer privileged access to AI startups and small-and-medium sized enterprises (SMEs), fostering growth and more effective scaling up.

    AI Factories are a core pillar of the Commission’s strategy for Europe to become a leader in AI, bringing together 17 Member States and two associated EuroHPC participating states. The infrastructure and services provided by AI Factories are essential for unlocking the full potential of the sector in Europe. Backed by the EU’s world-class network of supercomputers, these factories will bring together the key ingredients for AI innovation: computing power, data, and talent. This will enable AI companies, particularly SMEs and startups, as well as researchers, to enhance the training and development of large-scale, trustworthy and ethical AI models.

    As announced by President von der Leyen at the AI Action Summit in Paris, the InvestAI initiative aims to mobilise up to €200 billion of European investments in AI. This will include the deployment of several AI Gigafactories across Europe, which will be massive high-performance computing facilities designed to develop and train next-generation AI models and applications."

    #AI #AIGovernance #AIRegulation #AIFactories #AICompliance #EU

  • Joerg Theis
    5,323 followers

    I believe AI creates real value when it tackles hard, physical problems — the kind that live in factories, warehouses, and service tasks.

    Recently, I learned about the attached example from a plastics machine manufacturer and logistics provider struggling with unpredictable production schedules, warehouse congestion, and reactive maintenance routines. When a structured AI implementation approach was brought into the equation, the following outcomes were achieved 👇

    🔹 Smart Production Planning – Machine learning models forecasted demand and optimized resin batch production, cutting material waste by 18%.
    🔹 AI-Driven Warehouse Logistics – Intelligent slotting and routing algorithms boosted order fulfillment rates by 25%, reducing forklift travel time and idle inventory.
    🔹 Predictive Maintenance for Service Teams – Sensor data and pattern recognition flagged early signs of machine wear, reducing unplanned downtime by 30%.

    The result wasn’t automation replacing people — it was augmentation empowering people. Operators, warehouse managers, and service engineers gained real-time insights to make faster, better decisions.

    💡 Takeaway: AI success in industrial environments isn’t about technology first — it’s about aligning data, people, and process to create measurable operational impact.

    #AI #IndustrialServices #SmartManufacturing #WarehouseOptimization #PredictiveMaintenance #DigitalTransformation #OperationalExcellence
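As a rough sketch of the predictive-maintenance pattern described above (flag early wear when a sensor reading drifts well outside its recent baseline), here is a rolling z-score check. Real deployments use richer models; the function, window, and threshold here are hypothetical.

```python
import statistics


def flag_anomalies(readings, window=5, z_limit=3.0):
    """Flag readings that deviate strongly from their recent baseline.

    A reading is anomalous when its z-score against the previous
    `window` readings exceeds `z_limit`. Illustrative only.
    """
    flags = []
    for i, value in enumerate(readings):
        baseline = readings[max(0, i - window):i]
        if len(baseline) < window:
            flags.append(False)  # not enough history yet
            continue
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1e-9  # avoid divide-by-zero
        flags.append(abs(value - mean) / stdev > z_limit)
    return flags


# A stable vibration signal followed by a sudden wear-like spike.
vibration = [1.0, 1.1, 0.9, 1.0, 1.1, 1.0, 1.05, 4.0]
print(flag_anomalies(vibration))  # only the final spike is flagged
```

The same shape - baseline, deviation, threshold - underlies most first-pass condition-monitoring setups before teams graduate to learned models.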

  • Dr. Barry Scannell

    AI Law & Policy | Partner in Leading Irish Law Firm William Fry | Member of Irish Government’s Artificial Intelligence Advisory Council | PhD in AI & Copyright | LinkedIn Top Voice in AI | Global Top 200 AI Leaders 2025

    56,762 followers

    Europe needs its own AI ‘Stargate’ initiative, and we have the capability to create it, as demonstrated by the success of CERN. If we don’t, we will fall so far behind that we may never be able to catch up, and we will be forever reliant on other countries for our technology.

    This week’s announcement of the U.S. Stargate AI initiative, with its massive $500 billion investment into AI infrastructure, highlights the urgent need for Europe to step up. While the EU has been proactive in AI policy and regulation, it lacks a bold, centralised project that could position it as a true leader in AI innovation.

    However, CERN stands as proof that Europe can achieve scientific and technological excellence on a grand scale. CERN has grown into the world’s leading particle physics laboratory, responsible for groundbreaking discoveries such as the Higgs boson. Its flagship experiment, the Large Hadron Collider, is the most sophisticated particle accelerator ever built, requiring cooperation from 23 member states and contributions from countries around the world.

    CERN demonstrates Europe’s ability to pool financial and intellectual resources across national boundaries. A similar approach in AI could consolidate Europe’s fragmented AI ecosystem into a unified force capable of competing with the U.S. and China. Its infrastructure, processing petabytes of data daily, pushes the boundaries of high-performance computing and could serve as a model for AI supercomputing hubs across Europe. Beyond physics, CERN’s work has led to innovations in computing and data processing that have influenced AI research and development. It embodies Europe’s scientific independence and serves as an example of how strategic investments can ensure that Europe remains at the forefront of technological advancements.

    Europe is not standing still: the European Commission has announced the creation of AI Factories across the continent. These centres are designed to enhance Europe’s AI capabilities by providing researchers, startups, and enterprises with access to high-performance computing resources. The AI Factories are a step in the right direction, offering the infrastructure to support advanced AI and ensuring that European companies have access to the computing power they require. However, they are simply not enough. The initiative, while well-intentioned, lacks the scale, funding, and strategic ambition to match the efforts of the U.S. or China. Europe’s approach remains fragmented, with individual countries pursuing their own AI strategies without a unified, continent-wide vision.

    Europe cannot afford to treat AI as merely a regulatory challenge. It is an economic and strategic imperative. Europe must take inspiration from CERN’s model of collaboration with industry and extend it to AI. The EU’s biggest weakness is fragmentation. A centralised European AI organisation, modelled on CERN, could coordinate efforts across member states and provide a singular vision for AI leadership.

  • Ulrich Leidecker

    Chief Operating Officer at Phoenix Contact

    5,679 followers

    What if building automation became a driver of production efficiency? At our Phoenix Contact site in Bad Pyrmont, we’re exploring exactly that.

    During a recent visit, I met with Dr. Hannah Peter to discuss how we’re connecting facility management and manufacturing. The goal is smarter use of energy and resources. Our PLCnext Factory continuously collects data, which is analyzed by AI to provide infrastructure on demand. This leads to up to 50% lower operating costs.

    Over the past three years, we’ve seen measurable impact:
    ⬆️ 30% more productivity
    ⬇️ 30% less energy consumed
    💶 Approximately 1.5 million euros saved annually
    🌍 Around 200 tons of CO₂ avoided per year

    Facility systems, production, EV charging infrastructure, and a battery storage unit are all connected and largely powered by our own solar energy. We also collaborate locally, for example via the district heating network, to make use of existing resources. What we test and validate here is shared with customers and partners who are looking to digitize their own operations.

    This is sector coupling in practice - a step closer to the 1.5°C goal. Do we have all the answers? Not yet. But we’re learning fast and sharing what works.

    And here’s one more idea: What if we made these systems even more open and scalable, with a control solution built specifically for building applications, based on PLCnext Technology?

  • Dr. Isil Berkun

    Applying AI for Industry Intelligence | Stanford LEAD Finalist | Founder of DigiFab AI | 300K+ Learners | Former Intel AI Engineer | Polymath

    18,815 followers

    Manufacturing teams: Stop thinking AI is "just for software".

    I just analyzed how Anthropic's teams actually use Claude across their organization, and the translation to industrial use cases is shocking.

    Traditional AI → Industrial AI:
    - Debugging Infrastructure → Sensor logs, MES system bugs, PLC issues
    - Unit Test Generation → Hardware test planning, QA protocols
    - Code Reviews → Legacy code in robotic arms, CNC controllers
    - Data Visualization → Production floor dashboards for operators
    - Documentation → ISO/FDA protocols, incident playbooks

    The real insight? Claude is becoming a cool teammate! :) Anthropic uses it across:
    → Engineering (code reviews, debugging)
    → Security (risk assessment, config reviews)
    → Operations (process optimization, SOPs)
    → Quality (test planning, validation)
    → Compliance (regulatory docs, audits)

    This is the future of smart factories. Not more siloed dashboards (please!), but AI teammates positioned across every role in your organization.

    5 things manufacturing can steal (proudly) from Anthropic's playbook:
    1️⃣ Use AI for edge case identification, not just automation
    2️⃣ Replace documentation burnout with AI-first drafting
    3️⃣ Help teams think faster, not just work faster
    4️⃣ Deploy AI across ALL roles, not just IT
    5️⃣ Build organizational memory, not just velocity

    The companies getting this right aren't waiting for "AI to be ready for manufacturing." They're realizing it already is. We just need to catch up.

    What's your biggest AI opportunity in manufacturing? 👇 Read more in my Substack post, link in the comments.

    #ManufacturingAI #IndustrialAI #SmartFactory #Claude #DigiFabAI

  • Kirsch Mackey

    I review tools and systems by how they feel — not just what they promise. Electrical Engineering Workflow | ECAD | AI Productivity

    13,222 followers

    I built an entire PCB from scratch in 35 minutes using AI. FLUX.AI COPILOT transformed my engineering workflow.

    Starting point: A student's basic block diagram
    End result: Complete schematic + 3D layout

    The AI asked intelligent questions:
    "Which USB serial IC - CH340G or FT232RL?"
    "Internal or external oscillator?"
    "Do you want an RC filter on your inputs?"
    Real engineering decisions, while the AI handled the grunt work.

    Technical breakdown:

    POWER SUPPLY
    • Generated optimal rail voltages
    • Added protection circuitry
    • Selected efficient regulators

    MICROCONTROLLER
    • Automated pin assignments
    • Optimized peripheral routing
    • Generated decoupling network

    COMMUNICATION
    • USB serial interface
    • I2C expansion ports
    • Debug header placement

    SENSORS
    • Light-dependent resistors
    • Temperature monitoring
    • Motion detection

    The most powerful part for me was this: the AI suggested improvements I wouldn't have considered as a beginner or even an intermediate designer, but that are common in advanced design:
    • Better ground plane distribution
    • Reduced EMI through strategic routing
    • Thermal optimization via component placement

    This tool cuts my design time by 80%. Engineering evolves. Tools improve. We adapt or fall behind.

    I've documented the entire process in a free roadmap video. I'll share it with anyone who comments below. Serious about accelerating your PCB design workflow? Drop "Flux" in the comments.

    Like this post if you believe AI assistants will revolutionize hardware design - or at least make it A LOT easier, faster and more accurate.

  • Philippe Bartissol

    VP @ Dassault Systèmes | Industrial Equipment industry, Virtual Twins, Business Transformation, Mentor

    2,829 followers

    I was talking to a customer the other day, and it reminded me of an unavoidable fact: current manufacturing lines are just not set up for today’s products.

    Why?
    🧩 Products are more complex: a car in the 1970s might have had 5,000 parts; today, it’s more like 30,000.
    ⏩ Markets move faster: 20 years ago, a typical consumer product might be updated every 5 years. Now, it can be replaced as frequently as every 1.5 years.
    🥇 Products are more personalized: demand for customization is surging, with the market value expected to hit $172.5B by 2032 (7.2% CAGR).

    One response to these trends has been the rise of modular production platforms, where traditional machines and conveyor belts are supplemented with smart robots and AMRs. These production systems are not rigid and linear but living and dynamic, offering unparalleled versatility and adaptability.

    The challenge is installing and harmonizing these systems. That’s why virtual twins are so necessary in making the factory of tomorrow a reality, enabling line-builders to model the impact of any change in unprecedented detail.

    Virtual twins also point to a future in which line builders don’t just sell the machinery but operate as full value partners, helping their customers design, install, and continuously adapt these complex ecosystems to meet ever-evolving demands.

    Learn more about this exciting next frontier of flexible production lines powered by automation, AMRs and virtual twins: https://lnkd.in/eM8c5dD2
