Judge my CV and suggest which skills I need to work on as a 3rd-year ECE student.
https://redd.it/1pv9w3d
@r_embedded
Great books for getting started with C programming for embedded systems
I am a university student who has become interested in the embedded world and wants to make it my future goal. I am already studying how electronics work and doing a lot of research on which MCU to buy to get started, since I want to go directly to C and learn to control MCUs.
For this, I'd appreciate recommendations for some really good books, and if there are good tutorial series on YouTube, please mention those too; books are my first priority, though.
PEACE 🕊️
https://redd.it/1pvbqjf
@r_embedded
Building a true USB Audio Class 2.0 (UAC2) USB→I²S bridge on Teensy 4.1 or STM32H723 (Amanero-like). Looking for guidance.
Hi everyone,
I’m trying to build a proper USB Audio Class 2.0 (UAC2) USB→I²S bridge, comparable in behavior and stability to Amanero/XMOS-based solutions (async USB, feedback endpoint, no pops/clicks, long-term stable).
Target platforms:
Teensy 4.1 (i.MX RT1062)
STM32H723 (as an alternative)
What I’m aiming for:
USB Device, UAC2 (not UAC1)
Asynchronous mode with feedback endpoint
Stable long-term streaming (no drift, no periodic glitches)
PCM 24/32-bit, ideally up to 192 kHz+ (768 kHz optional, not required)
I²S output with the MCU as clock master
Clean handling of:
Sample-rate changes
USB suspend/resume
Host differences (Windows/macOS/Linux)
No Arduino “black box” abstractions (happy to work register-level)
On Teensy 4.1:
Is extending/reworking existing UAC2 implementations (async feedback, larger buffers) a sane path for a production-quality bridge?
Any known pitfalls with RT1062 USB HS + long-term async feedback stability?
On STM32H723:
Has anyone shipped a stable UAC2 async device (not just demo-grade)?
Is Cube middleware a dead end for this use case, or just needs heavy modification?
General:
Any open-source references that behave close to Amanero/XMOS in terms of clocking discipline and feedback?
Recommended buffer sizes / feedback strategies that work well on Windows without constant tuning?
Thanks in advance. Any pointers, repos, war stories, or “don’t do this, do that instead” advice would be hugely appreciated.
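Not a full answer, but here is a minimal sketch of the core of any async UAC2 device: the feedback-value computation. It assumes a high-speed device reporting samples per microframe in 16.16 fixed point, and a hypothetical counter that tracks LRCLK periods between SOF interrupts; the 12.13 layout in the USB 2.0 spec versus the 16.16 many hosts expect is itself one of the host-compatibility quirks asked about.

```c
/* Sketch of the UAC2 asynchronous feedback value (USB High-Speed).
 * sof_callback() and its argument are placeholders for whatever timer/capture
 * mechanism counts I2S word-clock (LRCLK) periods per SOF on your MCU. */

#include <stdint.h>

#define FB_SHIFT    16u   /* HS feedback commonly treated as samples/microframe, 16.16 */
#define SOF_WINDOW  32u   /* average over 32 microframes (4 ms) to reduce jitter */

static uint32_t accumulated_samples;
static uint32_t sof_count;
static volatile uint32_t feedback_value;   /* copied into the 4-byte feedback packet */

/* Call from the SOF (microframe) interrupt. */
void sof_callback(uint32_t lrclk_periods_since_last_sof)
{
    accumulated_samples += lrclk_periods_since_last_sof;
    if (++sof_count >= SOF_WINDOW) {
        /* samples per microframe in 16.16 fixed point:
         * exactly 48 kHz -> 6 samples/microframe -> 6 << 16 = 0x00060000 */
        feedback_value = (accumulated_samples << FB_SHIFT) / SOF_WINDOW;
        accumulated_samples = 0;
        sof_count = 0;
    }
}

/* The feedback endpoint sends the value little-endian in a 4-byte packet. */
void fill_feedback_packet(uint8_t buf[4])
{
    uint32_t fb = feedback_value;
    buf[0] = (uint8_t)(fb);
    buf[1] = (uint8_t)(fb >> 8);
    buf[2] = (uint8_t)(fb >> 16);
    buf[3] = (uint8_t)(fb >> 24);
}
```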
https://redd.it/1pvbprg
@r_embedded
STM32F4 USART baremetal driver code help
Hi! I'm a beginner to STM32 and started learning how to write device drivers from scratch using a bare-metal approach; the board I'm using is an STM32F411VET6. I've done blinking LED and interrupt-driven LED successfully, but USART is causing me a lot of trouble. I've written the driver and tested it with USART2 (PA2 -> TX, PA3 -> RX) to transmit "Hello from usart2" through an FTDI232 USB-to-UART TTL adapter, but I'm not getting any messages in Tera Term. While debugging, all the values are assigned properly (baud rate, RCC, GPIO), and the transmit loop runs 18 times (the length of the message above), yet I can't find what causes this issue. A logic analyser also shows a flat line. (Connections: PA2 TX to the FTDI RX pin, PA3 RX to TX, GND to GND; 9600 8N1 set in Tera Term; correct COM port selected.)
Code repo: https://github.com/Rinosh03/STM32DeviceDrivers
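For comparison, here is a minimal polling-mode USART2 TX sketch for the F411. It is not a fix for the repo above, but it assumes the default 16 MHz HSI with no APB1 prescaling and covers the usual culprits: GPIO alternate-function mode, AF7 mapping on PA2, and TE/UE set before writing DR.

```c
/* Minimal bare-metal USART2 TX for STM32F411 (PA2 = TX, AF7), assuming the
 * default 16 MHz HSI clock and APB1 prescaler of 1. Sketch only. */

#include "stm32f4xx.h"   /* CMSIS device header */

static void usart2_init(void)
{
    RCC->AHB1ENR |= RCC_AHB1ENR_GPIOAEN;        /* GPIOA clock */
    RCC->APB1ENR |= RCC_APB1ENR_USART2EN;       /* USART2 clock */

    /* PA2 to alternate-function mode (0b10), AF7 = USART2_TX */
    GPIOA->MODER &= ~GPIO_MODER_MODER2;
    GPIOA->MODER |=  GPIO_MODER_MODER2_1;
    GPIOA->AFR[0] &= ~(0xFu << (4 * 2));
    GPIOA->AFR[0] |=  (7u  << (4 * 2));

    /* 9600 baud @ 16 MHz: USARTDIV = 16e6 / (16 * 9600) ~= 104.17 -> 0x683 */
    USART2->BRR = 0x683;

    USART2->CR1 = USART_CR1_TE | USART_CR1_UE;  /* transmitter + USART enable */
}

static void usart2_write(const char *s)
{
    while (*s) {
        while (!(USART2->SR & USART_SR_TXE)) { } /* wait for empty data register */
        USART2->DR = (uint8_t)*s++;
    }
    while (!(USART2->SR & USART_SR_TC)) { }      /* wait for last byte to leave */
}

int main(void)
{
    usart2_init();
    for (;;) {
        usart2_write("Hello from usart2\r\n");
        for (volatile uint32_t i = 0; i < 400000; i++) { } /* crude delay */
    }
}
```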
https://redd.it/1pvf6ac
@r_embedded
STM32 + 7-inch LCD speed test: SPI vs FMC vs LTDC — how far can an MCU really go?
Hi everyone 👋
I've worked with STM32 for quite a while and have always been obsessed with LCD size. I kept wanting it bigger and bigger — and at some point I started wondering: **is 7 inches the largest LCD an STM32 microcontroller can realistically handle?**
Instead of guessing or trusting datasheets, I decided to actually test it.
I built a small setup using a **7-inch LCD** and compared three common display interfaces on STM32:
* **SPI**
* **FMC (parallel)**
* **LTDC**
The goal wasn’t synthetic benchmarks, but **real-world behavior**:
* Full-screen refresh speed
* UI responsiveness
* CPU load and system complexity
* How usable each option feels in practice
I recorded everything in a short YouTube video where you can clearly see the difference in refresh speed and smoothness between the three interfaces, side by side.
🎥 **Video here:** [**https://youtu.be/WyrIYqgV350**](https://youtu.be/WyrIYqgV350)
Some results were expected, but a few things genuinely surprised me — especially where SPI starts to feel painful, and how much difference LTDC makes when driving a larger panel.
I’m sharing this mainly as a **hands-on reference**, not as a “best solution” claim. Different projects have different constraints, and I’m sure many of you have much deeper experience than I do.
https://preview.redd.it/99q0hxgx1e9g1.png?width=1280&format=png&auto=webp&s=62c8a7285ba9bd3dcc6a1c28cc5f05993af3db0d
I’d really like to hear your thoughts:
* Have you ever driven a **7-inch (or larger) LCD** with STM32?
* At what point do you personally stop considering **SPI** for displays?
* Do you prefer **FMC or LTDC**, and why?
* Any optimization tricks you’ve used to squeeze more performance out of STM32 + LCD?
https://reddit.com/link/1pvipmu/video/csk85qm32e9g1/player
Happy to answer questions about the setup or measurements.
Feedback and criticism are very welcome — that’s how we all learn
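For anyone who wants a quick sanity check on why the interfaces differ so much, here is a back-of-envelope throughput estimate in plain C, assuming a typical 800x480 RGB565 7-inch panel and round-number bus speeds; the actual panel, SPI clock, and FMC timings used in the video may differ.

```c
/* Rough upper bounds on full-frame redraw rate, ignoring protocol overhead.
 * Pure arithmetic, no hardware access. */

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    const uint32_t width = 800, height = 480, bytes_per_px = 2;              /* RGB565 */
    const uint64_t frame_bytes = (uint64_t)width * height * bytes_per_px;    /* 768 000 B */

    const double spi_bytes_per_s = 50e6 / 8.0;    /* 50 MHz SPI clock, 1 bit per cycle  */
    const double fmc_bytes_per_s = 2.0 * 30e6;    /* 16-bit bus, ~30 MHz write cycles   */
    /* LTDC scans the framebuffer out of RAM continuously; refresh is set by the
     * pixel clock, so the cost of a redraw is memory bandwidth, not bus writes. */

    printf("Frame size: %llu bytes\n", (unsigned long long)frame_bytes);
    printf("SPI max full-frame rate: %.1f fps\n", spi_bytes_per_s / frame_bytes);
    printf("FMC max full-frame rate: %.1f fps\n", fmc_bytes_per_s / frame_bytes);
    return 0;
}
```

With those assumptions, SPI tops out around 8 fps for a full-screen redraw, which matches the "painful" feel described above, while a 16-bit FMC bus or LTDC has comfortable headroom.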
https://redd.it/1pvipmu
@r_embedded
ARM64 and X86_64 AI Audio Classification (521 Classes, YAMNet)
https://AudioClassify.com
https://redd.it/1pvmu6m
@r_embedded
Which microcontroller should I use?
Hello, I'm a 20-year-old French CS student and I've started an RC car project. My goal is high speed (with an adapted chassis, stability, etc.), plus other features later like PID control and maybe torque vectoring. I'm pretty limited in budget due to my status, and in board size because my car will only be ~30 cm long. I looked at the STM32H7 but it's expensive, the Pi Pico 2 but some think PIO is too hard to use, the Teensy 4.1 which is a bit expensive but why not, and finally the ESP32-S3. Programming doesn't scare me, but I'm not good enough at electronics to consider a custom PCB, etc. Sorry for my English, and thanks!
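For the PID part at least, any of the boards listed will cope easily; here is a tiny fixed-timestep PID sketch in C to give a feel for the compute involved. The gains, the 1 kHz rate, and the motor/speed-sensor hookup are placeholders, not a tuned controller.

```c
/* Minimal PID controller with output clamping and crude anti-windup. */

typedef struct {
    float kp, ki, kd;
    float integral;
    float prev_error;
    float out_min, out_max;
} pid_ctrl_t;

float pid_step(pid_ctrl_t *pid, float setpoint, float measured, float dt)
{
    float error = setpoint - measured;
    pid->integral += error * dt;
    float derivative = (error - pid->prev_error) / dt;
    pid->prev_error = error;

    float out = pid->kp * error + pid->ki * pid->integral + pid->kd * derivative;

    /* clamp, backing the integral out so it doesn't wind up while saturated */
    if (out > pid->out_max) { pid->integral -= error * dt; out = pid->out_max; }
    if (out < pid->out_min) { pid->integral -= error * dt; out = pid->out_min; }
    return out;
}

/* Example: call at a fixed rate, e.g. 1 kHz, from a timer interrupt:
 *   duty = pid_step(&speed_pid, target_mps, wheel_speed_mps, 0.001f);
 *   set_motor_pwm(duty);   // set_motor_pwm() is a placeholder for your driver
 */
```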
https://redd.it/1pvnool
@r_embedded
Developing IoT Projects with ESP32 - Second Edition PDF
Hi everyone,
I’m studying ESP32 for an IoT project and I’m trying to find the book “Developing IoT Projects with ESP32 – Second Edition” in PDF format for learning purposes.
If someone can help or point me to a resource, I’d be very grateful.
Thank you!
https://redd.it/1pvpbn6
@r_embedded
Renesas e² studio / SSP install fails on macOS and Windows ARM VM (GetLastError 317)
I’m trying to develop and flash firmware for a Renesas Synergy DK-S7G2 board, but I’m completely blocked by tooling issues across macOS and a Windows ARM VM. For context, I am using an M4 MacBook Pro.
I want to confirm whether what I’m attempting is fundamentally unsupported, or if I’m missing a required step.
When I tried on the Mac, the app failed to launch (“E2studio is damaged and can’t be opened”). Also, Synergy SSP is not available on macOS, so Synergy projects and flashing are unsupported even if e² studio runs.
On a Windows 11 ARM VM (UTM/QEMU on the Mac), Windows installs successfully, USB passthrough works, and the DK-S7G2 enumerates.
The LEDs on the board indicate power/debug connection
SSP v2.7.0 + e² studio installer fails with
InvocationTargetException
GetLastError() returned 317
The same error occurs when installing e² studio alone
I’ve tried running as Administrator, changing the install path, freeing disk space, reinstalling prerequisites, and disabling the firewall, still with no success.
https://redd.it/1pvq0pb
@r_embedded
Project Advice/MCU Selection for camera recording with overlay
Hi all,
For context I'm an embedded developer by profession, but this level of project is a bit out of my scope, I'm generally down in the bare-metal/FreeRTOS end of things, not the embedded linux end of the scale.
I have a project in mind that will take input from a single MIPI CSI (or USB, I don't really have a preference) camera, add some overlay graphics and text to the stream, and save to an SD card (or USB stick). Overlay graphics are generated from environmental sensors or incoming serial data (e.g. NMEA GPS data). There is no requirement for video output, but it's a nice-to-have. Let's assume that 1080p quality is sufficient, and I assume the encoding would be H.264 or H.265.
I'm assuming for now that the environmental sensors will mostly come "for free" in terms of MCU hardware and resources. Anything that can do the video heavy lifting will have enough to read a few SPI/I2C sensors and a UART or two.
I don't have a good handle on where to start for selecting an MCU. I can prototype this on a Pi, but I would be looking to integrate it onto a single PCB ideally. Maybe using the Raspberry Pi compute modules on a carrier are the way to go, but I would be open to a single PCB design either from scratch or based on a suitable open source design.
Any advice people can give around MCU selection would be helpful, especially with regard to:
- suitability for use with embedded linux (I'm assuming I would use yocto/buildroot to generate the OS).
- hardware support for video and associated software/kernel support
- existing open source designs
Thanks all and Merry Christmas!
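Not a part recommendation, but as a sketch of what the software side usually looks like on embedded Linux, here is the capture, overlay, encode, and record chain expressed with GStreamer's parse-launch API. The element names (especially the hardware encoder, shown here as v4l2h264enc) and the device/file paths are assumptions that vary per SoC and BSP, so treat the pipeline string as something to adapt, not a known-good config for any particular board.

```c
/* GStreamer sketch: camera -> text overlay -> H.264 encode -> MP4 on SD card.
 * Build with: gcc $(pkg-config --cflags --libs gstreamer-1.0) overlay_rec.c */

#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    /* textoverlay's "text" property can be updated at runtime with g_object_set()
     * from the sensor/NMEA reader thread to refresh the on-screen data. */
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080 ! "
        "videoconvert ! textoverlay name=osd text=\"GPS: ---\" ! "
        "v4l2h264enc ! h264parse ! mp4mux ! filesink location=/media/sd/out.mp4",
        &err);
    if (!pipeline) {
        g_printerr("Pipeline error: %s\n", err ? err->message : "unknown");
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Block until an error or end-of-stream, then clean up. */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                                 GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```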
https://redd.it/1pvq04l
@r_embedded
STM32H7B3I-DK and ESP32 s3 communication
Hello guys, I need some help. I want to build a UI with SquareLine Studio on my STM32H7B3I-DK and have it communicate with an ESP32-S3 N16R8. The STM32 will handle the GUI and read data from the ESP32 (which will handle all the logic and states); the STM32 is only responsible for displaying the interface.
I have run into some problems along the way. This is my first STM32 project, and I don’t really know how to, for example, enable SPI2 without breaking the whole SquareLine Studio project. Any help or tips would be greatly appreciated.
https://redd.it/1pvps0r
@r_embedded
Skills to develop during engineering course
Hello everyone,
I'm an electronics student who will graduate in 2028. What skills should I develop if my interest is in the embedded and firmware side?
As AI is improving, has it changed any aspects of your job in this field?
Also, do companies like NXP, Intel, Qualcomm, Bosch, and TI conduct hackathons and internships related to this? If they do, what are the prerequisites to attend them? Also, what do interviewers like to see in a CV, what do they expect us to know, and what should I be focusing on before I graduate?
https://redd.it/1pvv4ua
@r_embedded
Question about DDR3L DQS/DM byte lanes
Hi everyone, I came across this question when trying to route a board with DDR3L for the first time.
The DDR3L IC (MT41K256M16TW-107:P) has UDM/LDM, and UDQS/LDQS, and I am not sure whether to put the 'L' pins on byte lane 0, or byte lane 1.
To be more clear, by "byte lane 0", I refer to the byte lane that is connected to DQ[0:7], and byte lane 1 refers to the one connected to DQ[8:15].
I am seeing some conflicting information about what goes where.
This example puts LDM/LDQS on byte lane 0: https://github.com/fma23/XADC_Zynq7000/blob/master/ZedBoard_RevC.1_Schematic.pdf
And this one does the opposite: https://youtu.be/W3Jt_y6PHjA?list=PLOWdivEsxi3s5c_atSD8vQ8xYINmHR4Qm&t=249
And this one also does the opposite, the same as the Phil's Lab video: https://hforsten.com/img/pulsed/pulsed_schematic.pdf
I would appreciate if someone could give some insight on why these different projects switch these pins, and whether it matters.
https://redd.it/1pvwdpm
@r_embedded
Advice for embedded role at seed stage startup
Hello! I recently finished my MS in ECE and have ~3 YoE in embedded systems (DSP, FPGA, bare-metal) at a defense contractor, plus a FAANG embedded internship from undergrad. My current role is stable but has felt a bit stagnant, so I’ve been window shopping for other jobs. Also looking for a pay bump and something exciting.
I recently interviewed with a seed-stage aerospace startup (orbital semiconductor manufacturing), and the process has gone very well. However the team is extremely small, only two full-time engineers (roughly similar experience to me) plus a few consultants.
What makes me excited: the high ownership and autonomy, roles across board bring-up/comms/and telemetry, code getting flight-tested almost monthly, strong founding team and potential long-term network.
What makes me worried: I’ve never worked on a team this small (and no way to “switch teams” if it gets bad), no established processes or senior engineers to lean on day-to-day, not sure how to think of comp packages.
For those who’ve made a similar jump (especially early-career engineers or hardware folks):
1. What do you wish you had evaluated more carefully before joining?
2. What questions should I ask founders that go beyond the usual “vision” stuff?
3. What compensation or equity terms tend to matter most in practice at this stage?
4. What were the biggest red flags you ignored early on and paid for later?
5. How did this kind of role impact your long-term career trajectory (positively or negatively)?
Thanks!
https://redd.it/1pvxl8h
@r_embedded
UWB For Local Positioning Recommendations
Can anyone recommend a brand of UWB chip for local positioning (<50 m) that they've had a good (or just OK) time using? <30 cm accuracy is sufficient.
https://redd.it/1pvyy1b
@r_embedded
Advice needed: Embedded Linux bringup on a custom PCB
Hello everyone,
Embedded software guy here, but new to Linux board bring-up. I have a development kit that runs Linux (Renesas RZ/V2N), and we have the schematic as well as a working Yocto build for this board. How do we go about bringing up a custom PCB if we don't copy the reference schematic because it uses bigger/more expensive components?
For example, if we wish to use 2 GB of DDR instead of the original 8 GB on the EVK, where do I make the change from a software perspective to support the new chip? Is it the second-stage bootloader, the one after the ROM code (Arm TF-A)? Where do I find the RAM initialisation and training code? What needs to change if, say, I choose different eMMC and NOR flash memories?
In short, what are the "gotchas" that might prevent my custom PCB, with slightly different components than the reference design, from booting into Linux? I'm working with a hardware guy who will handle the PCB work, but I need to make sure I'm able to patch the SPL/U-Boot to make it work with the custom parts we choose.
https://redd.it/1pw09lz
@r_embedded
Flyback converter: Adding another secondary for MCU with primary-side (non-isolated) GND?
Hi everyone,
I’m working on a **400V → 12V flyback converter** using a custom transformer.
Current transformer windings:
* 1× Primary (HV side)
* 1× Auxiliary
* 1× Secondary (isolated) → used to generate **12V output**
Now I need **another low-voltage supply** to power an **MCU and some relays**.
**Important constraint:**
The MCU **must share the same GND as the primary side (non-isolated)** because it needs to directly control components on the non-isolated side of the circuit.
My questions:
1. Can I **add an extra secondary winding** to the transformer and **reference it to the primary-side GND** to power the MCU and relays?
2. If I do this, does that winding still count as a “secondary,” or is it effectively a **primary-referenced auxiliary winding**?
3. Are there **safety, EMI, or regulation issues** with having both:
* an **isolated secondary (12V output)**, and
* a **non-isolated low-voltage winding** (MCU supply) on the same flyback transformer?
Any reference designs, application notes, or practical advice would be really helpful.
Thanks in advance!
https://redd.it/1pw1c5i
@r_embedded
Any embedded engineers who got hired from Asia to Europe?
Do companies in Europe hire embedded engineers from Asia? I have only ever seen software engineers get hired abroad.
Is it possible?
https://redd.it/1pw3x07
@r_embedded
STM32CubeIDE: I2C connection cannot be enabled
I greet the entire community.
I'm trying to use I2C, but as shown in the image, this part cannot be changed in any way. I can't figure out why. How can I fix this?
https://preview.redd.it/nli3ezquwj9g1.png?width=1638&format=png&auto=webp&s=20cf12554db078a664f041135bd7709ea866c25b
https://redd.it/1pw4ftl
@r_embedded
STM32 gotcha: disabling timer outputs (MOE=0) disables its internal outputs too
It is documented, but I still burned MOSFETs because of it, and it is not on the famous gotchas list, so I'm posting it here :-).
Some STM32 microcontrollers have OPAMPs whose inputs can be switched by a timer PWM output. For instance, when TIM1 CH6 is low, the OPAMP looks at one pin; when CH6 is high, the OPAMP looks at another pin.
On the STM32G4, the OPAMPs can multiplex using either TIM1 CH6, TIM8 CH6 or TIM20 CH6. Some online documentation and application notes use TIM1 CH6 in their examples.
Gotcha: if you ever disable the TIM1 outputs, for instance with LL_TIM_DisableAllOutputs(TIM1), the TIM1 PWM outputs all go low (expected), and TIM1 CH6 gets stuck low as well (unexpected!). This breaks the OPAMP multiplexing.
Workaround: use TIM8 CH6 for multiplexing, or, if TIM8 may also be disabled, use TIM20 CH6 for the OPAMP multiplexing, and ensure that LL_TIM_EnableAllOutputs(TIM20) is called.
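A minimal sketch of that workaround on an STM32G4 might look like the following; the OPAMP_TCMR bit names are taken from RM0440 and should be checked against your CMSIS header, and TIM20's clock, CH6 compare configuration, and the OPAMP init itself are assumed to be done elsewhere.

```c
/* Route the OPAMP input multiplexing to TIM20 CH6 so that disabling TIM1's
 * outputs (MOE = 0) no longer freezes the mux. Sketch only. */

#include "stm32g4xx.h"
#include "stm32g4xx_ll_tim.h"

void opamp_mux_use_tim20(void)
{
    /* Select TIM20 CH6 (instead of TIM1/TIM8 CH6) as the automatic
     * input-switch source for OPAMP1. Bit names per RM0440; verify locally. */
    OPAMP1->TCMR &= ~(OPAMP_TCMR_T1CM_EN | OPAMP_TCMR_T8CM_EN);
    OPAMP1->TCMR |=  OPAMP_TCMR_T20CM_EN;

    /* TIM20 CH6 must actually toggle: MOE has to stay set for its outputs,
     * including the internal CH6 signal, to follow the compare logic. */
    LL_TIM_EnableAllOutputs(TIM20);
}

void motor_fault_shutdown(void)
{
    /* Killing the bridge PWM on TIM1 is now safe for the OPAMP mux,
     * because the mux is driven by TIM20, which stays enabled. */
    LL_TIM_DisableAllOutputs(TIM1);
}
```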
By the way, does anyone know why STM32 microcontrollers have so many gotchas? And does a list like that exist for other microcontroller families too?
https://redd.it/1pw5frk
@r_embedded