A one-day immersive workshop where physical sensors speak directly to AI — and you speak directly back to them.
| Parameter | Start | End | Min | Max | Avg | Trend |
|---|---|---|---|---|---|---|
| Gas Resistance | 35,020 Ω | 41,694 Ω | 35,020 Ω | 56,013 Ω | 39,501 Ω | ↑ Improving |
| IAQ Index | 58 | 56 | 50 | 60 | 51 | → Stable |
| eCO₂ | — | — | 498 ppm | 510 ppm | 501 ppm | → Stable |
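The summary columns above (start, end, min, max, average, trend) can all be derived from a raw reading series. A minimal sketch, using illustrative gas-resistance samples rather than actual gateway output:

```python
# Sketch: compute the summary statistics shown in the table above
# from a raw series of readings. Sample values are illustrative only.

def summarise(readings):
    """Return start, end, min, max, average, and a simple trend label."""
    start, end = readings[0], readings[-1]
    avg = sum(readings) / len(readings)
    # A simple 5% band decides the trend label; the gateway's
    # actual trend logic is not specified in this document.
    if end > start * 1.05:
        trend = "up"
    elif end < start * 0.95:
        trend = "down"
    else:
        trend = "stable"
    return {
        "start": start,
        "end": end,
        "min": min(readings),
        "max": max(readings),
        "avg": round(avg),
        "trend": trend,
    }

# Hypothetical gas-resistance samples in ohms
gas_resistance = [35020, 38500, 56013, 42000, 41694]
print(summarise(gas_resistance))
```

A rising gas-resistance trend corresponds to the "↑ Improving" label in the table, since higher resistance indicates cleaner air.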
In a conventional IoT lab, accessing sensor data means writing code, configuring libraries, and parsing serial output — before you can even ask a question. This workshop removes that barrier entirely.
Conventional lab: students spend most of their time on tooling, not understanding.
This workshop: students spend their time on understanding, not tooling.
This is not a sensor wiring workshop.
Connect a physical sensor to a live AI-accessible gateway in under 5 minutes using keyed, foolproof connectors.
Ask about gas levels, temperature trends, motion events, and solar readings — in plain English, and get structured, interpreted answers.
See how a language model interprets live physical-world data through MCP tool calls — the same mechanism powering modern AI agents.
Reason about what readings actually mean — IAQ indices, resistance trends, insolation patterns, occupancy signatures.
Treat sensors as conversation partners, not data sources. This is how the next generation of engineers will build.
Understand BLE sensor architecture, the gateway software stack, and how AI models consume real-time IoT telemetry through structured tool calls.
Interact with a live MCP server, understand how LLMs use tools, and build natural language interfaces to physical sensors — no hardware expertise needed.
Skip wiring complexity entirely. Ask questions in plain language and receive structured, AI-interpreted sensor data from Day 1.
Use the gateway as a data collection platform. Query historical trends, export time-series data, and run comparative experiments using AI-assisted analysis.
Every workshop station is fully equipped. You do not need to bring any hardware.
The gateway exposes an MCP server — a set of tools the AI can call directly, just as it would use a calculator or web search. When you ask a question, the AI orchestrates multiple tool calls and synthesises the results into a coherent answer.
Returns current gas resistance, IAQ index, and eCO₂ readings from the gas sensor.
Returns current temperature reading from the DS18B20 sensor.
Returns current and historical insolation data in W/m².
Returns current occupancy state and timestamped event history.
Returns min, max, average, start, and end values over any time window.
Returns trend direction and rate of change for any sensor parameter.
Returns any threshold violations or statistical anomalies detected.
Returns a holistic environmental assessment synthesised across all four sensors.
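Each of these tools is advertised to the AI as a named capability with a description and an input schema. A sketch of what that tool listing might look like in the JSON shape MCP uses: only `get_gas_level`, `get_trend`, and `assess_environment` are named elsewhere in this document; the schemas shown here are assumptions.

```python
import json

# Sketch of how the gateway might advertise three of its tools to the
# AI, in the JSON shape used by MCP tool listings. The input schemas
# are assumptions, not the gateway's actual definitions.
TOOLS = [
    {
        "name": "get_gas_level",
        "description": "Current gas resistance, IAQ index, and eCO2 readings",
        "inputSchema": {"type": "object", "properties": {}},
    },
    {
        "name": "get_trend",
        "description": "Trend direction and rate of change for a sensor parameter",
        "inputSchema": {
            "type": "object",
            "properties": {"parameter": {"type": "string"}},
            "required": ["parameter"],
        },
    },
    {
        "name": "assess_environment",
        "description": "Holistic environmental assessment across all four sensors",
        "inputSchema": {"type": "object", "properties": {}},
    },
]

print(json.dumps({"tools": TOOLS}, indent=2))
```

The AI reads these descriptions, not the sensor datasheets, which is why a question phrased in plain language can be routed to the right tool.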
get_gas_level → retrieves current IAQ (51), eCO₂ (501 ppm), gas resistance (41,694 Ω)
get_trend → confirms gas resistance trending upward (cleaner air), CO₂ stable
assess_environment → holistic view confirms Good air quality across all parameters
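The chain above can be sketched as a short script, with stubbed tool functions standing in for the live gateway. Return values are taken from the example readings, not live data:

```python
# Sketch of the three-step tool-call chain, with stubs standing in
# for the live gateway. Values come from the example in this document.

def get_gas_level():
    return {"iaq": 51, "eco2_ppm": 501, "gas_resistance_ohm": 41694}

def get_trend(parameter):
    trends = {"gas_resistance": "up", "eco2": "stable"}
    return {"parameter": parameter, "direction": trends.get(parameter, "unknown")}

def assess_environment():
    return {"air_quality": "Good"}

# The AI orchestrates the calls, then synthesises one coherent answer:
reading = get_gas_level()
trend = get_trend("gas_resistance")
verdict = assess_environment()
print(f"IAQ {reading['iaq']}, gas resistance trending {trend['direction']}, "
      f"overall air quality: {verdict['air_quality']}")
```

In the real system the language model decides which tools to call and in what order; the fixed sequence here only mirrors the worked example above.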
You are not learning to use sensors.
You are learning to think with them.
Each experiment is a question — answered by the sensor-AI system in real time.
Open one window. Place gas sensors in ventilated and sealed zones. Ask the AI to track how IAQ and eCO₂ diverge over 20 minutes.
Take the insolation sensor to different locations — north-facing window, south-facing rooftop, shaded corner. Ask the AI to compare and estimate harvest potential.
Monitor the lab entrance with the PIR sensor for 30 minutes. Ask: "If we automated lighting based on this data, how much energy would we save?"
Run all four sensors for one hour. Ask: "Give me a complete environmental profile — temperature comfort, air quality grade, solar gain, and occupancy pattern."
Introduce a deliberate change — seal the room, turn off ventilation, or bring a heat source near the temperature sensor. Ask: "Have there been any anomalies in the last 15 minutes?"
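An anomaly check like the one behind the anomaly tool can combine a fixed threshold with a statistical outlier test. A simplified sketch, assuming an eCO₂ ventilation threshold of 1,000 ppm; the gateway's actual detection logic is not specified in this document:

```python
from statistics import mean, stdev

# Simplified sketch of threshold + statistical anomaly detection.
# The 1000 ppm threshold and z-score limit are assumptions.
ECO2_THRESHOLD_PPM = 1000

def find_anomalies(readings, threshold=ECO2_THRESHOLD_PPM, z_limit=3.0):
    """Flag readings that violate a threshold or sit far from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    anomalies = []
    for i, r in enumerate(readings):
        if r > threshold:
            anomalies.append((i, r, "threshold"))
        elif sigma > 0 and abs(r - mu) / sigma > z_limit:
            anomalies.append((i, r, "outlier"))
    return anomalies

# Sealed-room experiment: eCO2 climbs after ventilation is cut
eco2 = [498, 502, 505, 510, 640, 890, 1150]
print(find_anomalies(eco2))
```

Running the sealed-room experiment above should produce exactly this kind of flag, which the AI then reports in answer to "Have there been any anomalies in the last 15 minutes?"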
1. Plug each sensor into its labelled port on the gateway enclosure. Keyed connectors prevent incorrect wiring — no datasheets needed.
2. Apply power to the gateway. All four sensors begin broadcasting data via BLE automatically within seconds.
3. Open the AI interface on your laptop, connect to the gateway's MCP server, and start asking questions in plain language.
| Feature | This Workshop | Conventional IoT Lab |
|---|---|---|
| Sensor access | Ask in plain language — get structured answers | Write code, parse output, build a dashboard |
| Hardware setup | Plug-and-play keyed connectors | Breadboard wiring, jumper cables, datasheets |
| Data interpretation | AI synthesises readings into actionable insight | Raw numbers — interpretation left to student |
| Time to first insight | Under 5 minutes | Typically an entire lab session |
| AI integration | Native — sensors are AI tools | Typically absent |
| Cloud dependency | None for data; optional for AI model | Often mandatory for platforms |
| Skill acquired | AI-native engineering thinking | Hardware interfacing fundamentals |