_florent_ changed the topic of #litex to: LiteX FPGA SoC builder and Cores / Github : https://github.com/enjoy-digital, https://github.com/litex-hub / Logs: https://libera.irclog.whitequark.org/litex
tpb has quit [Remote host closed the connection]
tpb has joined #litex
Degi has quit [Ping timeout: 256 seconds]
Degi has joined #litex
bl0x has joined #litex
bl0x_ has quit [Ping timeout: 252 seconds]
knicklicht has joined #litex
<_florent_> knicklicht: This should be possible, yes. You can find some OpenOCD .cfg files in litex_boards, e.g. this one should be close to what you want to do: https://github.com/litex-hub/litex-boards/blob/master/litex_boards/prog/openocd_trellisboard.cfg
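(For context: in litex_boards, a platform hands such a .cfg to LiteX through its create_programmer() method. A minimal sketch, assuming the trellisboard config; the device string is illustrative and the _io pin list is elided:)

    # Sketch of wiring an OpenOCD .cfg into a LiteX platform.
    from litex.build.lattice import LatticePlatform
    from litex.build.openocd import OpenOCD

    _io = []  # board pin definitions would go here

    class Platform(LatticePlatform):
        def __init__(self):
            LatticePlatform.__init__(self, "LFE5UM5G-85F-8BG381C", _io, toolchain="trellis")

        def create_programmer(self):
            # Point OpenOCD at the board-specific config, e.g.
            # litex_boards/prog/openocd_trellisboard.cfg.
            return OpenOCD("prog/openocd_trellisboard.cfg")

    # Usage: Platform().create_programmer().load_bitstream("build/top.bit")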
<knicklicht> Thanks, I will have a look at it. Currently I just use openFPGALoader to load it manually. I finally got everything set up. I also managed to build Zephyr and boot it. Really happy with how nicely everything fits together. Next step: enable the I2S cores. It should be possible to have multiple instances of the same core, right?
<_florent_> knicklicht: Great for Zephyr, what was the issue? (could be useful to know what it was if someone has the same issue in the future)
<_florent_> knicklicht: The SoC builder will allow you to integrate multiple I2S cores, yes; you will just have to see if https://github.com/enjoy-digital/litex/blob/master/litex/tools/litex_json2dts_zephyr.py can generate a .dts for multiple instances.
<knicklicht> I just needed to enable the timer uptime latch with --timer-uptime. This is not mentioned in any tutorials out there as far as I can tell, but the Zephyr build error hints at it.
<_florent_> knicklicht: if not, you'll have to edit the script or make manual copies/changes in the .dts.
<_florent_> knicklicht: after this, Zephyr should be able to handle the multiple I2S instances
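(A hedged sketch of what that integration could look like. The S7I2S class and its bus/ev attributes follow litex.soc.cores.i2s and the antmicro example referenced later in this log; pad names, addresses and fifo_depth are illustrative assumptions:)

    # Sketch: instantiating two I2S cores in one SoC. Each instance needs
    # its own submodule name, bus region and IRQ; CSRs are collected
    # automatically from the named submodules.
    from litex.soc.cores.i2s import S7I2S
    from litex.soc.integration.soc import SoCRegion

    def add_i2s(soc, name, origin, mem_size=0x40000):
        i2s = S7I2S(pads=soc.platform.request(name), fifo_depth=256)
        setattr(soc.submodules, name, i2s)  # unique name per instance
        soc.bus.add_slave(name, i2s.bus,
            SoCRegion(origin=origin, size=mem_size, cached=False))
        soc.irq.add(name)  # assumes the core exposes an EventManager (ev)

    # Two independent instances, each with its own region and interrupt:
    # add_i2s(soc, "i2s_rx", 0xb1000000)
    # add_i2s(soc, "i2s_tx", 0xb1040000)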
<knicklicht> Perfect
<knicklicht> Ah, a quick search tells me that --timer-uptime was mentioned in the Zephyr guide to get litex vexriscv running on the Arty
<_florent_> Thanks for the info on the issue
<_florent_> I just opened https://github.com/enjoy-digital/litex/issues/1555 to have a closer look at it
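(For future readers, a sketch of the two ways to enable the uptime latch, assuming current SoCCore behavior; the Arty target script is just an example:)

    # On the command line:
    #   python3 -m litex_boards.targets.digilent_arty --cpu-type vexriscv \
    #       --timer-uptime --build
    # The flag ultimately calls add_uptime() on the SoC's timer, adding the
    # latched 64-bit uptime CSRs that Zephyr's LiteX timer driver reads.
    from litex.soc.integration.soc_core import SoCCore

    class ZephyrSoC(SoCCore):
        def __init__(self, platform, sys_clk_freq, **kwargs):
            SoCCore.__init__(self, platform, sys_clk_freq, **kwargs)
            # timer0 is instantiated by SoCCore (with_timer=True by default).
            self.timer0.add_uptime()  # programmatic equivalent of --timer-uptime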
FabM has joined #litex
<MoeIcenowy> _florent_: BTW, as I haven't changed OpenC906 in a long time, should we create pythondata-cpu-openc906?
shorne has quit [Ping timeout: 260 seconds]
shorne has joined #litex
cr1901_ has joined #litex
cr1901 has quit [Ping timeout: 252 seconds]
<knicklicht> Is there a tutorial on how to add a core to an SoC? I found an example repo that shows how to add I2S, but it's two years old and I am unsure if it's up to date: https://github.com/antmicro/zephyr-on-litex-vexriscv/blob/0775f94d3537ec5e967ea2ac2b6aba0de3fa71af/soc_zephyr.py
<knicklicht> When I try to build I get "i2s_tx Region in IO region, it can't be cached: Origin: 0xb2000000, Size: 0x00040000, Mode: RW, Cached: True Linker: False" caused by: self.add_memory_region("i2s_tx", 0xb2000000, i2s_mem_size). In the example the memory regions are allocated differently. Where can I find out how to do the mapping?
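(The check fires because 0xb2000000 falls inside the SoC's IO window, where regions must be uncached. A sketch of two ways to declare the region, assuming i2s_tx exposes a Wishbone bus attribute and soc is the SoC instance; the size comes from the error above:)

    i2s_mem_size = 0x40000

    # (a) Legacy API: register the region with type="io" so it is uncached:
    soc.add_memory_region("i2s_tx", 0xb2000000, i2s_mem_size, type="io")

    # (b) Newer bus API: attach the slave with an explicitly uncached region:
    from litex.soc.integration.soc import SoCRegion
    soc.bus.add_slave("i2s_tx", i2s_tx.bus,
        SoCRegion(origin=0xb2000000, size=i2s_mem_size, cached=False))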
<knicklicht> Okay, I bypassed that by setting the region address to something before ethmac's bus range. Now I have a more serious problem: the i2s core seems to be designed only for Xilinx devices. It directly uses FIFOSyncMacro from litex.soc.cores.ram.xilinx_fifo_sync_macro. Is there an alternative for the ECP5?
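(One portable option, sketched below: Migen's generic synchronous FIFO, which Yosys/nextpnr can map to ECP5 block RAM. The wrapper class, widths and depths are illustrative, not a drop-in patch for the i2s core:)

    # Sketch of a vendor-neutral stand-in for the Xilinx FIFOSyncMacro,
    # built on Migen's generic buffered synchronous FIFO.
    from migen import Module
    from migen.genlib.fifo import SyncFIFOBuffered

    class PortableSyncFifo(Module):
        def __init__(self, width=32, depth=512):
            self.submodules.fifo = fifo = SyncFIFOBuffered(width, depth)
            # Push with din/we, pop with dout/re; readable/writable are
            # the status flags.
            self.din,  self.we  = fifo.din,  fifo.we
            self.dout, self.re  = fifo.dout, fifo.re
            self.readable, self.writable = fifo.readable, fifo.writable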
cr1901_ is now known as cr1901
cr1901 has quit [Read error: Connection reset by peer]
cr1901 has joined #litex
rektide has quit [Remote host closed the connection]
FabM has quit [Quit: Leaving]
knicklicht has quit [Quit: Client closed]
knicklicht has joined #litex
Guest64 has joined #litex
Guest64 has quit [Client Quit]
zjason`` is now known as zjason
TMM_ has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
TMM_ has joined #litex
knicklicht has quit [Ping timeout: 260 seconds]
cr1901 has quit [Remote host closed the connection]
cr1901 has joined #litex
cr1901 has quit [Remote host closed the connection]
cr1901 has joined #litex
<_florent_> MoeIcenowy: We should create a pythondata-cpu-openc906 repo, yes; I could look at it next week.
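(For reference, pythondata-* repos just vendor the HDL sources behind a pip-installable module. A sketch of the usual consumption pattern; pythondata_cpu_openc906 is hypothetical until the repo exists, and the attribute name follows the convention of packages like pythondata_cpu_vexriscv:)

    # Sketch of how LiteX-side code typically resolves a pythondata package.
    def get_openc906_rtl_dir():
        import pythondata_cpu_openc906  # hypothetical package name
        return pythondata_cpu_openc906.data_location  # root of the vendored RTL

    # e.g. platform.add_source_dir(get_openc906_rtl_dir())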