azonenberg changed the topic of #scopehal to: libscopehal, libscopeprotocols, and glscopeclient development and testing | https://github.com/glscopeclient/scopehal-apps | Logs: https://libera.irclog.whitequark.org/scopehal
<Johnsel> I'm happy to see what it does on Windows but I think there's still some discrepancy between my changes and bvernoux's
Degi_ has joined #scopehal
Degi has quit [Ping timeout: 244 seconds]
Degi_ is now known as Degi
<Johnsel> azonenberg glslang_c_interface.h: No such file or directory
<Johnsel> known issue to you?
<azonenberg> Johnsel: that's one of the files pulled in as a dependency by vkFFT
<azonenberg> grab that https://github.com/DTolm/VkFFT, compile it, probably using -DCMAKE_INSTALL_PREFIX=/usr so it installs system wide
<azonenberg> then make install
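A minimal sketch of that install sequence, assuming VkFFT's CMakeLists provides an install target (the /usr prefix comes from the advice above, the rest is the usual out-of-tree CMake flow):

    git clone https://github.com/DTolm/VkFFT
    cd VkFFT && mkdir build && cd build
    cmake -DCMAKE_INSTALL_PREFIX=/usr ..
    make
    sudo make install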
<azonenberg> another case of docs not being up to date w/ the bleeding edge code
<Johnsel> buckle up guys, we're on the bleeding edge
<Johnsel> and gals!
<azonenberg> (I also have some un-pushed dev work that might help or break some of this)
<azonenberg> and then lain has arm64 porting work to merge soon too
<azonenberg> things are gonna be in flux for a week or two while some of this stabilizes
<Johnsel> how's that going? and is it useful to spin up a ci instance and see if we can get feedback on the ci process there too while you work?
<Johnsel> because it would be annoying if you end up thinking up things that are nigh impossible to automate
<Johnsel> oh no nvm
<Johnsel> that's not an actual m1 box
<azonenberg> So we're starting by testing an arm64 linux VM on an m1
<azonenberg> and a pi4
<azonenberg> it compiles under that platform but we cant run it yet as we don't have the vulkan renderer up
<azonenberg> and none of those platforms have gl 4.3
<azonenberg> so next step will be getting the vulkan renderer up
<azonenberg> once we have vulkan + arm64 linux working, we'll look at osx specific porting
<Johnsel> keep in mind there's a good chance vulkan in that vm is a no-go
<Johnsel> I don't think there is a proper 3d accelerated GPU available in the vm software
<Johnsel> what are you using?
<azonenberg> parallels. vulkan in the vm is not expected to work
<azonenberg> it was just a way to save time getting things to compile vs doing dev on the pi
<Johnsel> ah yeah for sure that makes sense
<Johnsel> I'm interested to see how it'll perform on the rpi
<Johnsel> are you also having an issue with the GL end of things though?
<azonenberg> At this point, in the VM
<azonenberg> it compiles, and it runs to the point of displaying the GTK chrome with no WaveformAreas
<azonenberg> if you attempt to create a WaveformArea, it will most likely choke because it tries to create a GL 4.3 context
<azonenberg> (this is with un-merged dev code lain is working on)
<azonenberg> The plan is to merge those changes fairly soon, it's basically just #ifdef __x86_64__ around all of the AVX code
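A sketch of the guard pattern being described; the function names here are hypothetical, not lain's actual diff:

    #ifdef __x86_64__
    #include <immintrin.h>
    #endif

    void FillWaveform(float* data, size_t len)
    {
    #ifdef __x86_64__
        //AVX fast path, only compiled on x86-64
        FillWaveformAVX2(data, len);
    #else
        //Portable scalar fallback for arm64 and other platforms
        FillWaveformGeneric(data, len);
    #endif
    }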
<azonenberg> then she'll switch dev to a linux x86 box with an nvidia card (i.e. a fully supported platform today) and transition the renderer to vulkan
<azonenberg> Once that's done, keeping GL 2.x for the final compositing pass, we'll try to get that running on a pi
<azonenberg> fingers crossed it will work out of the box
<azonenberg> after any bugs we encounter there are fixed, next step is going to be trying to build it on m1 natively with moltenvk
<azonenberg> and see what happens
<azonenberg> In parallel with that, my focus is on replacing all of the OpenCL accelerated filters with Vulkan implementations
<azonenberg> writing new accelerated filters that were previously not accelerated (e.g. sinc upsampling)
<azonenberg> and preparing to completely stop using clFFT and FFTS
<Johnsel> I see, well good luck to you both.
<azonenberg> Yeah. i expect there will be rough edges and snags
<azonenberg> and there will likely be breakage and issues where the CI build and/or documentation are not keeping up with the latest code
<Johnsel> It would be useful to at least log changes in such a way that they can be easily replicated
<_whitenotifier-7> [scopehal] azonenberg assigned issue #455: Add #ifdef guards to disable all AVX optimizations on non-x86 platforms - https://github.com/glscopeclient/scopehal/issues/455
Johnsel has quit [Remote host closed the connection]
<_whitenotifier-7> [scopehal] lainy created branch arm64 - https://github.com/glscopeclient/scopehal
<_whitenotifier-7> [scopehal-apps] lainy created branch arm64 - https://github.com/glscopeclient/scopehal-apps
<_whitenotifier-7> [scopehal-apps] azonenberg opened issue #482: Make sure vkFFT is correctly detected by CMake for all supported/upcoming platforms - https://github.com/glscopeclient/scopehal-apps/issues/482
<_whitenotifier-7> [scopehal-apps] azonenberg labeled issue #482: Make sure vkFFT is correctly detected by CMake for all supported/upcoming platforms - https://github.com/glscopeclient/scopehal-apps/issues/482
<_whitenotifier-7> [scopehal] lainy opened pull request #675: Add #ifdefs around x86-specific code. - https://github.com/glscopeclient/scopehal/pull/675
<_whitenotifier-7> [scopehal-apps] lainy opened pull request #483: Building on arm64 - https://github.com/glscopeclient/scopehal-apps/pull/483
bvernoux has joined #scopehal
<bvernoux> In the latest glscopeclient we have a build error because of
<bvernoux> scopehal-apps/lib/scopehal/VulkanInit.cpp:36:10: fatal error: glslang_c_interface.h: No such file or directory
<bvernoux> I'm searching in the VulkanSDK but that include is not available
<bvernoux> It seems it will be included in a future VulkanSDK
<bvernoux> To be confirmed
<azonenberg> bvernoux: that should be installed in the SDK
<azonenberg> glslang-dev package
<bvernoux> but it is not in the latest official version
<bvernoux> ha really ?
<azonenberg> huh, windows vs linux?
<bvernoux> you think it is an option
<azonenberg> i have it, i know that for sure
<azonenberg> but i thought it was installed as part of the sdk
<azonenberg> it may not be on windows
<bvernoux> yes it is not today
<bvernoux> maybe it is just an option to set
<azonenberg> there's a lot more stuff coming in the next few days, i have some un-pushed stuff that depends on vkFFT
<bvernoux> strange, as we are using exactly the same options for linux/windows in the CI
<azonenberg> tl;dr we have to pull in the glsl compiler even though we're compiling shaders from glsl to spir-v at compile time
<azonenberg> because vkFFT does JIT generation of shader code for each size of fft
<bvernoux> I have removed glslang-dev from Ubuntu in CI
<bvernoux> as I was sure it was part of VulkanSDK ;)
<bvernoux> so that explains why the CI build also fails for Linux
<azonenberg> Yeah. also i had some code previously that assumed vkFFT was installed systemwide, idk if i ever pushed it or if it's an intermediate commit in my local tree
<bvernoux> so it could be a simple fix
<azonenberg> but i'm now including vkFFT as a submodule
<azonenberg> it's a single header file basically
<bvernoux> It seems it exists for msys2/mingw64 too https://packages.msys2.org/base/mingw-w64-glslang
<bvernoux> let's wait for your final vkFFT stuff before fixing it then
<bvernoux> I will try on my fork anyway
<azonenberg> Yeah. I can push now if you want to play with it, although it's incomplete. it does work but doesn't actually use vkFFT for anything
<azonenberg> only the FFT filter itself (not de-embed, channel emulation, spectrogram, or anything else that does FFTs) is affected so far
<azonenberg> it creates a vkFFT context but does not actually use it
<azonenberg> we use Vulkan to do the window function
<azonenberg> then pass the windowed data to FFTS for the FFT itself
<azonenberg> then normalize in software
<azonenberg> the next step is going to be doing the actual FFT in vulkan which is coming later today after i do some other stuff
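Roughly the interim pipeline being described, as a hypothetical pseudo-C++ sketch (ffts_execute is the real FFTS entry point, but the buffer and method names here are made up, not FFTFilter's actual code):

    //1. Window function (e.g. Blackman-Harris) runs as a Vulkan compute shader
    RunWindowShader(cmdBuf, inputBuf, windowedBuf);

    //2. The transform itself is still done on the CPU by FFTS for now
    ffts_execute(m_plan, windowedBuf.GetCpuPointer(), rawOutputBuf);

    //3. Normalize magnitudes in plain software (the part that moves to Vulkan next)
    for(size_t i = 0; i < numOutputSamples; i++)
        magnitudeBuf[i] = scale * hypotf(rawOutputBuf[i*2], rawOutputBuf[i*2 + 1]);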
<bvernoux> there is no hurry, I'll let you finish the stuff
<azonenberg> Ok
<azonenberg> Yeah i'm bouncing between this, probe stuff, and some microscope sample prep
<bvernoux> glslang_c_interface.h was created on 24 Dec 2019
<bvernoux> it is pretty old, but it is not part of the VulkanSDK, which is very strange
<bvernoux> You might be interested in installing the optional Volk header, source and library too, as it is an option for the VulkanSDK
<bvernoux> and also later SDL2 libraries and headers
<bvernoux> potentially also Vulkan Memory Allocator library
<bvernoux> could be interesting
<bvernoux> To avoid other external dependencies, as everything can be easily installed with the VulkanSDK
<azonenberg> TFW i write a unit test for my WIP GPU fft filter and instead end up catching what appears to be a bug in my AVX accelerated Blackman-Harris window function
<azonenberg> welp
<d1b2> <Darius> heh
<_whitenotifier-7> [scopehal] bvernoux commented on pull request #509: GTK-less scopehal - https://github.com/glscopeclient/scopehal/pull/509#issuecomment-1236093878
<_whitenotifier-7> [scopehal] bvernoux commented on pull request #511: rigol function generator - https://github.com/glscopeclient/scopehal/pull/511#issuecomment-1236093904
<azonenberg> Hmmmm
<azonenberg> So i did find and fix a small bug. what i'm now seeing is a *slight* error between the two
<azonenberg> up to about 2.3%
<azonenberg> i suspect it's probably the AVX accelerated transcendental library i'm using
<azonenberg> Which i had never actually evaluated for numerical accuracy before
<azonenberg> hmmmmm
<azonenberg> very, very interesting
<azonenberg> the error is sinusoidal
<azonenberg> that seems to point to an actual bug
Fridtjof has quit [Ping timeout: 264 seconds]
Stary has quit [Ping timeout: 268 seconds]
Stary has joined #scopehal
Fridtjof has joined #scopehal
Johnsel has joined #scopehal
<Johnsel> bvernoux did you see my working build?
<Johnsel> as in, working in the CI
<bvernoux> ha no
<bvernoux> Have you fixed the latest issue?
<Johnsel> no that issue is still unfixed
<Johnsel> but check that out
<Johnsel> it does approach the environment variables in a different way from yours so we'll have to decide what to do with that
<bvernoux> if it is simpler it is better ;)
<bvernoux> my variables were very far from perfect
<bvernoux> I do not understand why you add each time
<bvernoux> shell: msys2 {0}
<bvernoux> as implicitly that is what is done for each step
<bvernoux> but lots of things are strange with GitHub CI
<bvernoux> Windows:
<bvernoux>   runs-on: windows-latest
<bvernoux>   defaults:
<bvernoux>     run:
<bvernoux>       shell: msys2 {0}
<bvernoux> it was intended to use msys2 by default
<bvernoux> I see you have changed that
<Johnsel> yep, and I think that is not working correctly, because this really seems to be the fix
<Johnsel> I don't think you can default to that shell, because the first steps won't have that shell available yet
<Johnsel> which is probably why it starts up different new shells or loses the reference to it
<Johnsel> i'm not sure, but it really seems to be the fix
<bvernoux> yes it seems your fix is the right one and I prefer when it is explicit in all cases
<bvernoux> you can do a PR to fix that
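The explicit per-step form being preferred here, sketched with an illustrative step (the step name and build commands are placeholders, not the real workflow file):

    - name: Build scopehal-apps
      shell: msys2 {0}
      run: |
        mkdir -p build && cd build
        cmake .. -G Ninja
        ninja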
<Johnsel> I also don't have the shell path issues you had; I just have to set the one environment variable to point it to the vulkan sdk
<bvernoux> next step will be to fix VkFFT
<Johnsel> though that is not entirely proven since we have non-building code right now
<Johnsel> yes I think waiting for that and confirming it works well is best
<bvernoux> yes but in any case it is better to state explicitly which shell is used
<Johnsel> first fix vkFFT and then PR
<bvernoux> GitHub CI has very strange issues
<Johnsel> because my code is a very old commit
<Johnsel> from the first time vulkan got added, 10 days ago or so
<bvernoux> you see, the persistent settings work for windows and do not work for ubuntu-latest
<bvernoux> very strange too
<Johnsel> yes lots of weirdness, the msys2 on windows does not help either
<bvernoux> I have avoided that in the ubuntu build and it works, but it is not very clear why
<bvernoux> I'm also very far from being an expert on GitHub CI builds ;)
<Johnsel> it's very easy to lose stuff in the CI, either env vars or output or shells
<bvernoux> yes env vars not persistent is really crazy
<Johnsel> I tried spinning up a -local- windows instance + builder
<bvernoux> like the path
<Johnsel> nigh impossible, it is entirely built to spin up azure cloud vms only
<Johnsel> here check this out
<bvernoux> ha, you're saying you can't reproduce that issue when using a local instance?
<bvernoux> which is intended to be identical to the GitHub CI build stuff?
<Johnsel> their script parses https://github.com/actions/runner-images/blob/main/images/win/windows2022.json and spins up an azure vm
<Johnsel> and then runs 30 powershell scripts remotely
<Johnsel> with those parameters
<bvernoux> ha interesting
<Johnsel> those 2 start the process
<bvernoux> next step will be to check how to optimize the build time using cached stuff or prebuilt
<Johnsel> it's all built on Packer
<bvernoux> I do not really know how to do that with GitHub CI
<Johnsel> it's fucking disastrous
<Johnsel> yeah we should definitely do that
<Johnsel> simplest is just to cache files
<Johnsel> and restore the file cache
<Johnsel> we can cache the sdk, previous build
<Johnsel> do msi build in 1 step
<Johnsel> and only do fresh builds on certain times/actions
<bvernoux> yes as the build time is horrible today
<Johnsel> so a simple commit just does an incremental build off of the last build
<Johnsel> that way we can at least go down to sub 30 minute builds hopefully
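One way to do the file caching discussed above, assuming the stock actions/cache action; the path and key are illustrative (1.3.224.1 being the SDK version mentioned further down):

    - name: Cache Vulkan SDK
      uses: actions/cache@v3
      with:
        path: C:\VulkanSDK
        key: vulkan-sdk-1.3.224.1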
<bvernoux> It will clearly be a must-have as soon as possible
<Johnsel> it will also be possible to spend money on faster instances
<Johnsel> there is a public beta right now
<Johnsel> no GPUs though :(
<bvernoux> Yes, Andrew plans to do offline stuff
<bvernoux> later
<bvernoux> to even run stuff on real hardware that fully supports vulkan ...
<Johnsel> Yes that is mostly based on my ideas
<Johnsel> though the windows runner is definitely a problem for that
<bvernoux> as so far GitHub instances have no GPU and other stuff, especially the free ones
<bvernoux> ha ok it was your idea
<Johnsel> it is not impossible to build a windows runner, but it will be a lot of work
<Johnsel> on the other platforms it is easier to run local builders
<Johnsel> at least m1 was very easy
<bvernoux> ha interesting
<Johnsel> maybe linux has some weird requirements but I would expect not
<bvernoux> so today the windows local builder is broken ?
<Johnsel> the windows runner has all sorts of expectations of the system having things installed in certain places
<bvernoux> If you want to make it like the github ones?
<Johnsel> e.g. a custom msys2 initialization script
<Johnsel> -shell- init
<bvernoux> and today it is not possible to reproduce exactly the same type of build as github for windows ?
<Johnsel> and a whole list of other requirements
<Johnsel> no I have not been able to
<bvernoux> ha ok very bad
<Johnsel> but I did create a somewhat reasonable middle ground
<Johnsel> which is enabling RDP lol
<Johnsel> and a loop script
<Johnsel> so we can do manual debugging
<Johnsel> nasty workaround but it at least works
<bvernoux> yes it is better than nothing
<Johnsel> so next time it does something weird we can run things manually
<Johnsel> those instances are slow as fuck though
<Johnsel> but at least it gives us a way forward
<Johnsel> I think vkFFT + ci optimization are good next steps for now though
<Johnsel> do you want to work on either of those together, or each take one?
<bvernoux> I'm also struggling with glslang
<bvernoux> as some headers are not part of the VulkanSDK
<bvernoux> on Windows at least
<bvernoux> I do not understand why
<Johnsel> yeah I saw that but my local install got lost when I had to reinstall Windows
<Johnsel> I got it almost back up though and can take a look
<bvernoux> ha yes great
<bvernoux> I have found how to install the headers anyway
<bvernoux> as they are not present in VulkanSDK they are available with msys2/mingw64
<bvernoux> but the fun part is that the path is not found ;)
<bvernoux> it will require a cmake script to add the path for the include dir I think
<Johnsel> hmmmm, we should maybe investigate how these path issues come to be
<bvernoux> if you have any other idea you are welcome
<Johnsel> because it seems that there is an underlying reason why it has so many pathing issues
<Johnsel> there is one setting, one sec
<bvernoux> the solution so far
<bvernoux> pacman -S mingw-w64-x86_64-glslang
<bvernoux> it includes the missing headers in the package
<bvernoux> but during build it does not find it
<Johnsel> that has a way to at least inherit path
<Johnsel> we can try that
<bvernoux> I have tested locally on my computer
<bvernoux> so it is not another issue with the GitHub CI
<bvernoux> for windows
<Johnsel> but let me see if I can replicate it on my end with your pacman
<bvernoux> I reproduce it locally with a real Windows10 + MSYS2+MINGW64
<bvernoux> for more details
<bvernoux> what we need is to find glslang_c_interface.h
<bvernoux> which is in /mingw64/include/glslang/Include/glslang_c_interface.h
<bvernoux> I will check those scripts
<bvernoux> mingw64/share/glslang/glslang-config-version.cmake
<bvernoux> mingw64/share/glslang/glslang-config.cmake
<bvernoux> mingw64/share/glslang/glslang-targets.cmake
<bvernoux> mingw64/share/glslang/glslang-targets-release.cmake
<bvernoux> maybe they automatically add the path during the build ...
<bvernoux> to be checked what we are supposed to do after installing that package
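Since the msys2 package ships those config files, a hedged alternative would be CMake's config mode; the glslang::glslang target name is a guess based on the exported *-targets.cmake files, not something verified against this package:

    find_package(glslang CONFIG QUIET)
    if(TARGET glslang::glslang)
        target_link_libraries(scopehal PRIVATE glslang::glslang)
    endif()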
<bvernoux> the issue is not present on Ubuntu 20.04 LTS
<bvernoux> to be checked I have a doubt ;)
<bvernoux> for Ubuntu it just requires glslang-dev
<bvernoux> to be installed with apt-get ...
<bvernoux> But I'm also not sure the include path will be resolved during the build
<bvernoux> for the famous glslang_c_interface.h
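A sketch of the kind of cmake fix being discussed, using only stock CMake commands; the variable name and search paths are assumptions:

    # Locate the directory that contains glslang/Include/glslang_c_interface.h
    find_path(GLSLANG_INCLUDE_DIR glslang/Include/glslang_c_interface.h
        PATHS /mingw64/include /usr/include)
    if(GLSLANG_INCLUDE_DIR)
        include_directories(${GLSLANG_INCLUDE_DIR}/glslang/Include)
    endif()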
<Johnsel> hmm -I/usr/include/glslang/Include
<Johnsel> seems it's not picking up the mingw64 environment
<Johnsel> other packages are found as e.g. -isystem C:/msys64/mingw64/lib/gtkmm-3.0/include
<Johnsel> I think it may be shaderc
<Johnsel> do we define the dependency right in the script?
<Johnsel> in PKGBUILD I mean
<bvernoux> The dependencies related to include and so on
<Johnsel> I think that could be it
<bvernoux> in cmake in the worst case
<bvernoux> if we need something like a FindGLSLC module
<bvernoux> as it is not present in VulkanSDK so far
<bvernoux> to be checked
<bvernoux> it can also be added in PKGBUILD
<Johnsel> I mean it's a windows dependency so better PKGBUILD
<bvernoux> yes
<Johnsel> but adding it does not solve it
<Johnsel> I'm not sure how this build system works
<bvernoux> to be checked if there is not the same issue on Ubuntu
<Johnsel> perhaps the package is just wrong; it seems to add -a- path to the -I, but not the right one
<bvernoux> I do not know so far
<bvernoux> I'm more a cmake/CI build bad hacker ;)
<Johnsel> I also hate that build() modifies the code
<bvernoux> It is clearly not something where I'm fluent
<Johnsel> well, the repo
<bvernoux> I have fixed past stuff with trial and error ;)
<bvernoux> especially the CI build, which is a needless nightmare, as I also do not know enough about how it is intended to work
<Johnsel> sure that's my approach too but that's fine if you can learn as you go
<Johnsel> though I do know CI
<Johnsel> although the GitHub CI is weird
<Johnsel> though I think it may be in our favor
<Johnsel> msys2 is just a beast
<bvernoux> at least mingw64 builds real native stuff, not like the cygwin crap ;)
<Johnsel> true
<bvernoux> I really hated cygwin crap
<Johnsel> that would be worse
<Johnsel> I would not even work on that
<Johnsel> that's for other people to get headaches from
<bvernoux> Yes me too
<bvernoux> Cygwin is a dead end for anything ;)
<bvernoux> it is often used just to rebuild linux stuff on Windows
<bvernoux> Anyway it is very good if you know CI stuff well
<bvernoux> for glscopeclient, which is becoming more and more complex, it is a must-have
<Johnsel> I have to disagree on that; it should not have any system dependencies, the gtk stuff is just one big bugfest
<bvernoux> If you are good at CMake stuff, that is very good too
<bvernoux> as I really hate it and I'm not good ;)
<Johnsel> unless you mean having CI, that I agree on
<bvernoux> yes I hate gtk too ...
<Johnsel> anyway azonenberg: https://code.videolan.org/videolan/libplacebo interesting find
<Johnsel> very much related project goals
<bvernoux> I saw glscopeclient is intended to run on an RPI4 ?
<Johnsel> and might have some interesting finds for things you or we will run into
<bvernoux> I imagine the nightmare, as I doubt the RPI4 GPU is fast enough, and likewise the A72 is pretty slow ....
<bvernoux> but here the important point is more to have a fast GPU
<Johnsel> yes RPI4 linux so it builds and runs on arm64
<Johnsel> and they dev in a vm on the m1
<bvernoux> I imagine it is more just for a POC ?
<Johnsel> indeed
<Johnsel> not actual use
<bvernoux> as I doubt it will be usable
<bvernoux> ha ok
<Johnsel> that's not the point, it's just to isolate the arm64 specific issues from the arm64 + osx specific issues
<bvernoux> The M1 seems amazing but I hate everything Apple does, especially how they lock users into their stuff
<bvernoux> So I will never play with it
<bvernoux> OSX is a clear no go for me
<bvernoux> But anyway it is interesting to have it supported to have more users coming ...
<bvernoux> yes the RPI4 is a very good first step, to at least build on ARM64 Linux
<bvernoux> As I imagine the M1 support will be full of traps ;)
<bvernoux> The GPU is a pure blackbox IIRC, even worse than what we have with Nvidia or AMD
<Johnsel> also I think we need to set up vulkan layers on windows too
<Johnsel> export LD_LIBRARY_PATH=$VULKAN_SDK/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
<Johnsel> presumably that makes it work on linux
<Johnsel> is there really no cleaner way to do this?
<Johnsel> afaik there is a script you need to run in the vulkan install that sets up all the paths correctly
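That script is setup-env.sh at the root of the LunarG Linux SDK tarball (the install location below is an assumption):

    # Sets VULKAN_SDK, PATH, LD_LIBRARY_PATH and VK_LAYER_PATH in one go
    source ~/vulkan/1.3.224.1/setup-env.sh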
<Johnsel> anyway if it works it works but it's not very clean lol
<Johnsel> I think it may be an issue in the FindVulkan.cmake
<bvernoux> it's me ;)
<bvernoux> yes it is crap
<bvernoux> it is the official directive to install the Vulkan SDK
<bvernoux> I have not checked if we can do something in a cleaner way as for the CI build it is not very important
<Johnsel> well, I'd argue it is important because you want your CI to be the cleanest way to install so you have a reference
<Johnsel> anyway do we look if Vulkan_glslc_FOUND ?
<Johnsel> and add that target
<Johnsel> The ``glslc`` and ``glslangValidator`` components are provided even
<Johnsel> if not explicitly requested (for backward compatibility).
<Johnsel> Vulkan_glslang_LIBRARY and/or Vulkan_shaderc_combined_LIBRARY
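What requesting those components might look like, assuming a CMake new enough (3.24+) for FindVulkan component support:

    find_package(Vulkan REQUIRED COMPONENTS glslc glslangValidator)
    if(Vulkan_glslc_FOUND)
        message(STATUS "glslc found at ${Vulkan_GLSLC_EXECUTABLE}")
    endif()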
<Johnsel> also I half joked about the eew, it's not that bad and we can table it to do once we have it actually building
<Johnsel> I can be a bit brash sometimes so I hope you know it's not meant in a negative way
<Johnsel> hmmm
<Johnsel> # For backward compatibility as `FindVulkan` in previous CMake versions allow to retrieve `glslc`
<Johnsel> # and `glslangValidator` without requesting the corresponding component.
<Johnsel> #237
<Johnsel> bvernoux what the actual fuck....
<benishor> Found Vulkan: /usr/lib/x86_64-linux-gnu/libvulkan.so (found version "1.3.204") missing components: glslc glslangValidator
<benishor> trying to build on ubuntu 22.04
<benishor> how do I get those missing components?
<benishor> not sure where I need to add the include/lib dirs
<benishor> grrrrrrrrrrrrrr
<benishor> scopehal's CMakeLists.txt:
<benishor> # TODO: this needs to come from FindPackage etc
<benishor> /usr/include/glslang/Include/
<benishor> still no workie
<benishor> most likely due to refactoring?
<benishor> either that or my vulkan stuff is too new
<Johnsel> yeah lots of weird things going on
<Johnsel> did you see your CI code working entirely with lain's build?
<Johnsel> I really am getting a headache from this
<Johnsel> oh sorry you are not bvernoux
<benishor> nope, still some nick starting with b
<Johnsel> I don't think right now is a good moment to build as there are lots of things broken and in motion
<benishor> gah
<Johnsel> I'd wait a few hours, maybe days for something reasonable
<benishor> any other way I can get a working version for ubuntu?
<benishor> doesn't need to be the latest
<Johnsel> you could pick an older build or see if it's in a package manager
<Johnsel> or wait
<benishor> I'll wait
<azonenberg> Yeah ok so let me push what i have now. it's functional, but the FFTFilter block lost GPU acceleration of the actual FFT temporarily
<azonenberg> (I deleted the OpenCL and then added Vulkan for the window functions, but not for the actual FFT proper yet)
<azonenberg> plus some more unit tests and bug fixes
<azonenberg> just so we don't get people diverging too far from latest code
<benishor> my current errors are related to vulkan api
<_whitenotifier-7> [scopehal] azonenberg pushed 11 commits to master [+4/-0/±22] https://github.com/glscopeclient/scopehal/compare/a7dff336c106...a363edaa4922
<_whitenotifier-7> [scopehal] azonenberg 8c55d8d - Removed clFFT calls from FFTFilter in preparation for transitioning to Vulkan
<_whitenotifier-7> [scopehal] azonenberg 8f0cac3 - DemoOscilloscope: mark waveforms as modified CPU side
<_whitenotifier-7> [scopehal] azonenberg 9e29487 - TRCImportFilter: update for new shader configuration
<_whitenotifier-7> [scopehal] ... and 8 more commits.
<_whitenotifier-7> [scopehal-apps] azonenberg pushed 7 commits to master [+4/-2/±10] https://github.com/glscopeclient/scopehal-apps/compare/a2a7d5250c46...e28f34d0aa46
<benishor> let's see if this fixes it
<_whitenotifier-7> [scopehal-apps] azonenberg fbd79e9 - Refactoring: renamed a bunch of tests. Added initial (incomplete) skeleton test case for FFT.
<_whitenotifier-7> [scopehal-apps] azonenberg 6a81aad - Added --nogpufilter argument
<_whitenotifier-7> [scopehal-apps] azonenberg 7d77a94 - Refactoring: Filters test uses common function for checking results
<_whitenotifier-7> [scopehal-apps] ... and 4 more commits.
<azonenberg> this also moves vkFFT to a submodule vs having it come from the system
<azonenberg> so one less external dependency to rely on
<azonenberg> Running out to do a quick errand, i'll be back shortly. Next step later today will be merging lain's arm64 build fixes
<azonenberg> at that point we should compile on arm64 but there will still be lots of fixes to actually run on pi4 or m1
<benishor> still getting errors related to vulkan
<Johnsel> which vulkan sdk did you install?
<Johnsel> VulkanSDK-1.3.224.1 is the one that ought to work
bvernoux has quit [Ping timeout: 244 seconds]
<_whitenotifier-7> [scopehal] azonenberg pushed 2 commits to master [+0/-0/±60] https://github.com/glscopeclient/scopehal/compare/a363edaa4922...02dd69218fa1
<_whitenotifier-7> [scopehal] azonenberg 02dd692 - Merge remote-tracking branch 'origin/arm64'
<_whitenotifier-7> [scopehal] azonenberg closed pull request #675: Add #ifdefs around x86-specific code. - https://github.com/glscopeclient/scopehal/pull/675
<_whitenotifier-7> [scopehal] azonenberg closed issue #455: Add #ifdef guards to disable all AVX optimizations on non-x86 platforms - https://github.com/glscopeclient/scopehal/issues/455
<_whitenotifier-7> [scopehal-apps] azonenberg pushed 3 commits to master [+0/-0/±14] https://github.com/glscopeclient/scopehal-apps/compare/e28f34d0aa46...759b7b81d86a
<_whitenotifier-7> [scopehal-apps] azonenberg 759b7b8 - Merge remote-tracking branch 'origin/arm64'
<_whitenotifier-7> [scopehal-apps] azonenberg closed pull request #483: Building on arm64 - https://github.com/glscopeclient/scopehal-apps/pull/483
<Johnsel> azonenberg: something interesting happened with that -fsigned-char addition
<Johnsel> it fixed the windows ci
<Johnsel> without my fix for it it still ran
<azonenberg> interesting. i knew we needed fsigned-char for arm64
<azonenberg> but i was not expecting it to matter on windows
<Johnsel> is that weird or is that weird
<Johnsel> the weirdest part is that it is not always necessary, because it can be worked around with my fixes to pin it to a certain shell
<Johnsel> or at least it seems that that is also a fix; given that the rest of the code is broken, I haven't seen it applied to bvernoux's changes and work
<Johnsel> but there is definitely something fucky going on
<Johnsel> because I don't understand why this became a problem when it did
<Johnsel> I will be returning to my own projects for a bit again because I keep saying to myself to do that but end up here anyway
<Johnsel> I'll be back in a couple days though feel free to ping me if there is a need
Johnsel has quit [Ping timeout: 268 seconds]
<azonenberg> oh boy
Johnsel has joined #scopehal
<azonenberg> welp, we're now importing vkfft as a submodule from a local fork
<azonenberg> because upstream has a thread safety bug that causes use-after-frees
Johnsel has quit [Ping timeout: 252 seconds]
<azonenberg> ok i got vkfft working, still have to GPU the normalization of the output
<azonenberg> but that should be quick