00:36 <Johnsel> I'm happy to see what it does on Windows but I think there's still some discrepancy between my changes and bvernoux's
00:52 Degi_ has joined #scopehal
00:52 Degi has quit [Ping timeout: 244 seconds]
00:52 Degi_ is now known as Degi
01:22 <Johnsel> azonenberg: glslang_c_interface.h: No such file or directory
01:22 <Johnsel> known issue to you?
01:23 <azonenberg> Johnsel: that's one of the files pulled in as a dependency by vkFFT
01:23 <azonenberg> then make install
01:23 <azonenberg> another case of the docs not being up to date with the bleeding-edge code
01:24 <Johnsel> buckle up guys, we're on the bleeding edge
01:24 <Johnsel> and gals!
01:24 <azonenberg> (I also have some un-pushed dev work that might help or break some of this)
01:24 <azonenberg> and then lain has arm64 porting work to merge soon too
01:24 <azonenberg> things are gonna be in flux for a week or two while some of this stabilizes
01:26 <Johnsel> how's that going? and is it useful to spin up a CI instance and see if we can get feedback on the CI process there too while you work?
01:26 <Johnsel> because it would be annoying if you end up thinking up things that are nigh impossible to automate
01:27 <Johnsel> oh no, nvm
01:27 <Johnsel> that's not an actual M1 box
02:47 <azonenberg> So we're starting by testing an arm64 Linux VM on an M1
02:47 <azonenberg> and a Pi 4
02:48 <azonenberg> it compiles on that platform but we can't run it yet as we don't have the Vulkan renderer up
02:48 <azonenberg> and none of those platforms have GL 4.3
02:48 <azonenberg> so the next step will be getting the Vulkan renderer up
02:48 <azonenberg> once we have Vulkan + arm64 Linux working, we'll look at OSX-specific porting
02:49 <Johnsel> keep in mind there's a good chance Vulkan in that VM is a no-go
02:49 <Johnsel> I don't think there is a proper 3D-accelerated GPU available in the VM software
02:50 <Johnsel> what are you using?
02:51 <azonenberg> Parallels. Vulkan in the VM is not expected to work
02:51 <azonenberg> it was just a way to save time getting things to compile vs doing dev on the Pi
02:51 <Johnsel> ah yeah, for sure, that makes sense
02:52 <Johnsel> I'm interested to see how it'll perform on the RPi
02:52 <Johnsel> are you also having an issue with the GL end of things though?
02:55 <azonenberg> At this point, in the VM
02:55 <azonenberg> it compiles, and it runs to the point of displaying the GTK chrome with no WaveformAreas
02:55 <azonenberg> if you attempt to create a WaveformArea, it will most likely choke because it tries to create a GL 4.3 context
02:56 <azonenberg> (this is with un-merged dev code lain is working on)
02:56 <azonenberg> The plan is to merge those changes fairly soon, it's basically just #ifdef __x86_64__ around all of the AVX code
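The guard pattern described above can be sketched as follows. This is illustrative, not the actual scopehal code: function names are hypothetical, and the AVX path is additionally gated on `__AVX__` so the sketch compiles without special flags.

```cpp
#include <cstddef>

// Portable fallback, used on arm64 and any other non-x86 target
void ScaleSamplesGeneric(float* data, size_t len, float scale)
{
    for(size_t i = 0; i < len; i++)
        data[i] *= scale;
}

#if defined(__x86_64__) && defined(__AVX__)
#include <immintrin.h>

// AVX path: process 8 floats per iteration, scalar tail for the rest
void ScaleSamplesAVX(float* data, size_t len, float scale)
{
    size_t i = 0;
    __m256 vscale = _mm256_set1_ps(scale);
    for(; i + 8 <= len; i += 8)
    {
        __m256 v = _mm256_loadu_ps(data + i);
        _mm256_storeu_ps(data + i, _mm256_mul_ps(v, vscale));
    }
    for(; i < len; i++)
        data[i] *= scale;
}
#endif

// Dispatch: identical results either way, only the speed differs
void ScaleSamples(float* data, size_t len, float scale)
{
#if defined(__x86_64__) && defined(__AVX__)
    ScaleSamplesAVX(data, len, scale);
#else
    ScaleSamplesGeneric(data, len, scale);
#endif
}
```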
02:57 <azonenberg> then she'll switch dev to a Linux x86 box with an Nvidia card (i.e. a fully supported platform today) and transition the renderer to Vulkan
02:57 <azonenberg> Once that's done, keeping GL 2.x for the final compositing pass, we'll try to get that running on a Pi
02:57 <azonenberg> fingers crossed it will work out of the box
02:58 <azonenberg> after any bugs we encounter there are fixed, the next step is going to be trying to build it on M1 natively with MoltenVK
02:58 <azonenberg> and see what happens
02:58 <azonenberg> In parallel with that, my focus is on replacing all of the OpenCL-accelerated filters with Vulkan implementations
02:59 <azonenberg> writing new accelerated filters that were previously not accelerated (e.g. sinc upsampling)
02:59 <azonenberg> and preparing to completely stop using clFFT and FFTS
02:59 <Johnsel> I see, well good luck to you both.
03:01 <azonenberg> Yeah. I expect there will be rough edges and snags
03:01 <azonenberg> and there will likely be breakage and issues where the CI build and/or documentation are not keeping up with the latest code
03:04 <Johnsel> It would be useful to at least log changes in such a way that they can be easily replicated
03:48 Johnsel has quit [Remote host closed the connection]
07:41 bvernoux has joined #scopehal
07:41 <bvernoux> In the latest glscopeclient we have a build error because of
07:41 <bvernoux> scopehal-apps/lib/scopehal/VulkanInit.cpp:36:10: fatal error: glslang_c_interface.h: No such file or directory
07:41 <bvernoux> I'm searching in the VulkanSDK but that include is not available
07:42 <bvernoux> It seems it shall be included in a future VulkanSDK
07:42 <bvernoux> To be confirmed
07:42 <azonenberg> bvernoux: that should be installed in the SDK
07:42 <azonenberg> glslang-dev package
07:43 <bvernoux> but it is not in the latest official version
07:43 <bvernoux> ha really?
07:43 <azonenberg> huh, Windows vs Linux?
07:43 <bvernoux> you think it is an option
07:43 <azonenberg> I have it, I know that for sure
07:43 <azonenberg> but I thought it was installed as part of the SDK
07:43 <azonenberg> it may not be on Windows
07:43 <bvernoux> yes, it is not today
07:43 <bvernoux> maybe it is just an option to set
07:43 <azonenberg> there's a lot more stuff coming in the next few days, I have some un-pushed stuff that depends on vkFFT
07:43 <bvernoux> strange, as we are using exactly the same options for Linux/Windows in the CI
07:44 <azonenberg> tl;dr we have to pull in the GLSL compiler even though we're compiling shaders from GLSL to SPIR-V at compile time
07:44 <azonenberg> because vkFFT does JIT generation of shader code for each size of FFT
07:44 <bvernoux> I have removed glslang-dev from Ubuntu in the CI
07:44 <bvernoux> as I was sure it was part of VulkanSDK ;)
07:45 <bvernoux> so that explains why the CI build also fails for Linux
07:46 <azonenberg> Yeah. also I had some code previously that assumed vkFFT was installed systemwide, idk if I ever pushed it or if it's an intermediate commit in my local tree
07:46 <bvernoux> so it could be a simple fix
07:46 <azonenberg> but I'm now including vkFFT as a submodule
07:46 <azonenberg> it's a single header file basically
07:47 <bvernoux> let's wait for your final stuff on vkFFT before fixing it then
07:47 <bvernoux> I will try on my fork anyway
07:48 <azonenberg> Yeah. I can push now if you want to play with it, although it's incomplete. it does work but doesn't actually use vkFFT for anything
07:48 <azonenberg> only the FFT filter itself (not de-embed, channel emulation, spectrogram, or anything else that does FFTs) is affected so far
07:48 <azonenberg> it creates a vkFFT context but does not actually use it
07:48 <azonenberg> we use Vulkan to do the window function
07:48 <azonenberg> then pass the windowed data to FFTS
07:48 <azonenberg> then normalize in software
07:49 <azonenberg> the next step is going to be doing the actual FFT in Vulkan, which is coming later today after I do some other stuff
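The interim pipeline described above (window the input, transform, normalize) can be shown as a CPU reference. In glscopeclient the window runs in a Vulkan compute shader and the transform in FFTS; this standalone sketch just shows the math, and the normalization factor is a typical choice, not necessarily the one the project uses.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// 4-term Blackman-Harris window (the same window discussed later in the log)
std::vector<float> BlackmanHarrisWindow(size_t n)
{
    const float pi = 3.14159265358979f;
    const float a0 = 0.35875f, a1 = 0.48829f, a2 = 0.14128f, a3 = 0.01168f;
    std::vector<float> w(n);
    for(size_t i = 0; i < n; i++)
    {
        float t = 2.0f * pi * i / (n - 1);
        w[i] = a0 - a1*std::cos(t) + a2*std::cos(2*t) - a3*std::cos(3*t);
    }
    return w;
}

// Apply the window, then scale by 1/N (a common FFT normalization)
std::vector<float> WindowAndNormalize(const std::vector<float>& in)
{
    auto w = BlackmanHarrisWindow(in.size());
    std::vector<float> out(in.size());
    for(size_t i = 0; i < in.size(); i++)
        out[i] = in[i] * w[i] / in.size();
    return out;
}
```

The window stage is embarrassingly parallel (one multiply per sample), which is why it was the first piece moved to a Vulkan compute shader.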
07:51 <bvernoux> there is no hurry, I'll let you finish the stuff
07:53 <azonenberg> Yeah, I'm bouncing between this, probe stuff, and some microscope sample prep
08:15 <bvernoux> It was created on 24 Dec 2019
08:15 <bvernoux> it is pretty old, but it is not part of VulkanSDK, which is very strange
08:16 <bvernoux> You could be interested to install the optional Volk header, source and library too, as it is an option for VulkanSDK
08:17 <bvernoux> and also later the SDL2 libraries and headers
08:18 <bvernoux> potentially also the Vulkan Memory Allocator library
08:19 <bvernoux> could be interesting
08:19 <bvernoux> to avoid other external dependencies, as everything can be easily installed with VulkanSDK
08:55 <azonenberg> TFW I write a unit test for my WIP GPU FFT filter and instead end up catching what appears to be a bug in my AVX-accelerated Blackman-Harris window function
09:36 <d1b2> <Darius> heh
10:55 <azonenberg> So I did find and fix a small bug. what I'm now seeing is a *slight* error between the two
10:56 <azonenberg> up to about 2.3%
10:56 <azonenberg> I suspect it's probably the AVX-accelerated transcendental library I'm using
10:56 <azonenberg> Which I had never actually evaluated for numerical accuracy before
10:59 <azonenberg> hmmmmm
10:59 <azonenberg> very, very interesting
10:59 <azonenberg> the error is sinusoidal
10:59 <azonenberg> that seems to point to an actual bug
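A minimal version of the accuracy check being discussed: sweep an approximate cosine against the libm reference and record the worst error. Here a 6th-order Taylor polynomial stands in for the AVX transcendental library (purely hypothetical; it is not the library actually used).

```cpp
#include <cmath>

// Stand-in approximation: Taylor series about 0, accurate near 0 and
// degrading toward +/-pi (any vectorized cos approximation has some
// analogous error profile)
float ApproxCos(float x)
{
    float x2 = x * x;
    return 1 - x2/2 + x2*x2/24 - x2*x2*x2/720;
}

// Worst absolute error of the approximation over [lo, hi]
float MaxAbsError(float lo, float hi, int steps)
{
    float worst = 0;
    for(int i = 0; i <= steps; i++)
    {
        float x = lo + (hi - lo) * i / steps;
        float err = std::fabs(ApproxCos(x) - std::cos(x));
        if(err > worst)
            worst = err;
    }
    return worst;
}
```

Plotting the per-sample error rather than just the maximum is what reveals structure: a sinusoidal error curve, as seen above, suggests a phase or argument bug rather than ordinary rounding noise.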
15:53 Fridtjof has quit [Ping timeout: 264 seconds]
15:53 Stary has quit [Ping timeout: 268 seconds]
16:17 Stary has joined #scopehal
16:22 Fridtjof has joined #scopehal
17:07 Johnsel has joined #scopehal
17:07 <Johnsel> bvernoux did you see my working build?
17:07 <Johnsel> as in, working in the CI
17:08 <bvernoux> Have you fixed the latest issue?
17:08 <Johnsel> no, that issue is still unfixed
17:08 <Johnsel> but check that out
17:08 <Johnsel> it approaches the environment variables in a different way from yours, so we'll have to decide what to do with that
17:09 <bvernoux> if it is simpler, it is better ;)
17:09 <bvernoux> my variables were very far from perfect
17:10 <bvernoux> I do not understand why you add each time
17:10 <bvernoux> shell: msys2 {0}
17:10 <bvernoux> as implicitly that is what is done for each step
17:11 <bvernoux> but lots of things are strange with the GitHub CI
17:11 <bvernoux> Windows:
17:11 <bvernoux> runs-on: windows-latest
17:11 <bvernoux> shell: msys2 {0}
17:11 <bvernoux> defaults:
17:11 <bvernoux> it was intended to use msys2 by default
17:11 <bvernoux> I see you have changed that
17:12 <Johnsel> yep, and I think that is not working correctly, because that really seems to be the fix
17:12 <Johnsel> I don't think you can default to that shell, because for the first steps that shell does not exist yet
17:13 <Johnsel> which is probably why it starts up different new shells or loses the reference to it
17:13 <Johnsel> I'm not sure, but it really seems to be the fix
17:13 <bvernoux> yes, it seems your fix is the right one, and I prefer when it is explicit in all cases
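The two workflow styles being compared can be sketched as follows. Job and step names are illustrative, not the actual scopehal-apps workflow.

```yaml
# Approach 1 (original): job-wide default shell. Every `run` step uses
# the MSYS2 login shell, including any step that executes before
# setup-msys2 has created that shell - the failure mode discussed above.
jobs:
  Windows:
    runs-on: windows-latest
    defaults:
      run:
        shell: msys2 {0}
    steps:
      - uses: msys2/setup-msys2@v2
      - name: Build
        run: make -j4

# Approach 2 (the fix): explicit per-step shell, set only on steps that
# run after msys2 exists.
#  Windows:
#    runs-on: windows-latest
#    steps:
#      - uses: msys2/setup-msys2@v2
#      - name: Build
#        shell: msys2 {0}
#        run: make -j4
```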
17:14 <bvernoux> you can do a PR to fix that
17:14 <Johnsel> I also don't have the shell path issues you had, I just have to set the one environment variable to point it to the Vulkan SDK
17:14 <bvernoux> the next step will be to fix vkFFT
17:14 <Johnsel> though that is not entirely proven, since we have non-building code right now
17:14 <Johnsel> yes, I think waiting for that and confirming it works well is best
17:14 <bvernoux> yes, but in any case it is better to explicitly state which shell is used
17:14 <Johnsel> first fix vkFFT and then PR
17:15 <bvernoux> the GitHub CI has very strange issues
17:15 <Johnsel> because my code is a very old commit
17:15 <Johnsel> from the first time Vulkan got added, 10 days ago or so
17:15 <bvernoux> you see the persistent settings work for Windows and do not work for ubuntu-latest
17:15 <bvernoux> very strange too
17:15 <Johnsel> yes, lots of weirdness, the msys2 on Windows does not help either
17:15 <bvernoux> I have avoided that in the Ubuntu build and it works, but it is not very clear why
17:16 <bvernoux> I'm also very far from being an expert on GitHub CI builds ;)
17:16 <Johnsel> it's very easy to lose stuff in the CI, either env vars or output or shells
17:16 <bvernoux> yes, env vars not being persistent is really crazy
17:16 <Johnsel> I tried spinning up a -local- Windows instance + builder
17:16 <bvernoux> like the path
17:16 <Johnsel> nigh impossible, it is entirely built to spin up Azure cloud VMs only
17:16 <Johnsel> here, check this out
17:17 <bvernoux> ha, you're saying you cannot reproduce that issue when using a local instance?
17:17 <bvernoux> which is intended to be identical to the GitHub CI build stuff?
17:17 <Johnsel> and then runs 30 PowerShell scripts remotely
17:17 <Johnsel> with those parameters
17:18 <bvernoux> ha, interesting
17:18 <Johnsel> those 2 start the process
17:18 <bvernoux> the next step will be to check how to optimize the build time using cached stuff or prebuilts
17:18 <Johnsel> it's all built on Packer
17:18 <bvernoux> I do not really know how to do that with the GitHub CI
17:18 <Johnsel> it's fucking disastrous
17:18 <Johnsel> yeah, we should definitely do that
17:18 <Johnsel> simplest is just to cache files
17:18 <Johnsel> and restore the file cache
17:19 <Johnsel> we can cache the SDK, previous build
17:19 <Johnsel> do the MSI build in 1 step
17:19 <Johnsel> and only do fresh builds at certain times/on certain actions
17:19 <bvernoux> yes, as the build time is horrible today
17:19 <Johnsel> so a simple commit just does an incremental build off of the last build
17:19 <Johnsel> that way we can hopefully at least get down to sub-30-minute builds
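The caching plan sketched above could start with `actions/cache`. Paths and keys here are illustrative; a real workflow would derive keys from whatever actually invalidates each cache (SDK version, lockfiles, commit).

```yaml
    steps:
      - uses: actions/checkout@v3
      - name: Cache Vulkan SDK
        uses: actions/cache@v3
        with:
          path: C:\VulkanSDK
          key: vulkansdk-1.3.224.1
      - name: Cache previous build tree
        uses: actions/cache@v3
        with:
          path: build
          key: build-${{ runner.os }}-${{ github.sha }}
          # Fall back to the most recent build for this OS, giving an
          # incremental rebuild instead of a from-scratch build
          restore-keys: build-${{ runner.os }}-
```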
17:20 <bvernoux> It will clearly be a must-have as soon as possible
17:20 <Johnsel> it will also be possible to spend money on faster instances
17:20 <Johnsel> there is a public beta right now
17:20 <Johnsel> no GPUs though :(
17:20 <bvernoux> Yes, Andrew plans to do offline stuff
17:20 <bvernoux> to even run stuff on real hardware fully supporting Vulkan...
17:20 <Johnsel> Yes, that is mostly based on my ideas
17:20 <Johnsel> though the Windows runner is definitely a problem for that
17:20 <bvernoux> as so far the GitHub instances have no GPU and other stuff, especially the free ones
17:21 <bvernoux> ha ok, it was your idea
17:21 <Johnsel> although it is not impossible to build a Windows runner, it will be a lot of work
17:21 <Johnsel> on the other platforms it is easier to run local builders
17:21 <Johnsel> at least M1 was very easy
17:21 <bvernoux> ha, interesting
17:21 <Johnsel> maybe Linux has some weird requirements, but I would expect not
17:21 <bvernoux> so today the Windows local builder is broken?
17:21 <Johnsel> the Windows runner has all sorts of expectations of the system having things installed in certain places
17:22 <bvernoux> If you want to do like the GitHub ones?
17:22 <Johnsel> e.g. a custom msys2 initialization script
17:22 <Johnsel> -shell- init
17:22 <bvernoux> and today it is not possible to reproduce exactly the same type of build as GitHub for Windows?
17:22 <Johnsel> and a whole list of other requirements
17:22 <Johnsel> no, I have not been able to
17:22 <bvernoux> ha ok, very bad
17:22 <Johnsel> but I did create a somewhat reasonable middle ground
17:23 <Johnsel> which is enabling RDP lol
17:23 <Johnsel> and a loop script
17:23 <Johnsel> so we can do manual debugging
17:23 <Johnsel> nasty workaround but it at least works
17:23 <bvernoux> yes, it is better than nothing
17:23 <Johnsel> so next time it does something weird we can run things manually
17:23 <Johnsel> those instances are slow as fuck though
17:24 <Johnsel> but at least it gives us a way forward
17:24 <Johnsel> I think vkFFT + CI optimization are good next steps for now though
17:24 <Johnsel> do you want to work on either of those together, or you take one and I take the other?
17:24 <bvernoux> I'm also struggling with glslang
17:24 <bvernoux> as some headers are not part of VulkanSDK
17:24 <bvernoux> on Windows at least
17:24 <bvernoux> I do not understand why
17:25 <Johnsel> yeah, I saw that, but my local install got lost when I had to reinstall Windows
17:25 <Johnsel> I have it almost back up though and can take a look
17:25 <bvernoux> ha yes, great
17:25 <bvernoux> I have found how to install the headers anyway
17:25 <bvernoux> as they are not present in VulkanSDK, they are available with msys2/mingw64
17:25 <bvernoux> but the fun part is the path is not found ;)
17:26 <bvernoux> it will require a cmake script to add the path for the include dir, I think
17:26 <Johnsel> hmmmm, we should maybe investigate how these path issues come to be
17:26 <bvernoux> if you have any other idea you are welcome
17:26 <Johnsel> because it seems that there is an underlying reason why it has so many pathing issues
17:26 <Johnsel> there is one setting, one sec
17:27 <bvernoux> the solution so far
17:27 <bvernoux> pacman -S mingw-w64-x86_64-glslang
17:27 <bvernoux> the package includes the missing headers
17:27 <bvernoux> but during the build it does not find them
17:27 <Johnsel> that has a way to at least inherit the path
17:27 <Johnsel> we can try that
17:27 <bvernoux> I have tested locally on my computer
17:27 <bvernoux> so it is not another issue of the GitHub CI
17:27 <bvernoux> for Windows
17:28 <Johnsel> but let me see if I can replicate it on my end with your pacman command
17:28 <bvernoux> I reproduce it locally with a real Windows 10 + MSYS2 + MINGW64
17:28 <bvernoux> for more details
17:29 <bvernoux> what we need is to find glslang_c_interface.h
17:29 <bvernoux> which is in /mingw64/include/glslang/Include/glslang_c_interface.h
17:29 <bvernoux> I will check those scripts
17:30 <bvernoux> mingw64/share/glslang/glslang-config-version.cmake
17:30 <bvernoux> mingw64/share/glslang/glslang-config.cmake
17:30 <bvernoux> mingw64/share/glslang/glslang-targets.cmake
17:30 <bvernoux> mingw64/share/glslang/glslang-targets-release.cmake
17:30 <bvernoux> maybe they add the path automatically during the build...
17:30 <bvernoux> to be checked what we are intended to do after installing that package
17:30 <bvernoux> the issue is not present on Ubuntu 20.04 LTS
17:31 <bvernoux> to be checked, I have a doubt ;)
17:31 <bvernoux> for Ubuntu it just requires glslang-dev
17:32 <bvernoux> to be installed with apt-get...
17:32 <bvernoux> But I'm also not sure the include path will be resolved during the build afterwards
17:32 <bvernoux> for the famous glslang_c_interface.h
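One way to locate the header being discussed without hardcoding the MSYS2 or Ubuntu prefix is to let CMake search the usual include roots. This is a sketch, not the project's actual CMakeLists.txt; the target name at the end is hypothetical.

```cmake
# Find the directory containing glslang_c_interface.h, wherever the
# platform's glslang package put it (e.g. /usr/include/glslang/Include
# on Ubuntu, /mingw64/include/glslang/Include under MSYS2)
find_path(GLSLANG_C_INTERFACE_DIR
    NAMES glslang_c_interface.h
    PATH_SUFFIXES glslang/Include
    DOC "Directory containing glslang_c_interface.h")

if(NOT GLSLANG_C_INTERFACE_DIR)
    message(FATAL_ERROR
        "glslang_c_interface.h not found - install glslang-dev (apt) "
        "or mingw-w64-x86_64-glslang (pacman)")
endif()

# Hypothetical target name; substitute the real library target
target_include_directories(scopehal PRIVATE ${GLSLANG_C_INTERFACE_DIR})
```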
17:33 <Johnsel> hmm, -I/usr/include/glslang/Include
17:33 <Johnsel> seems it's not picking up the mingw64 environment
17:34 <Johnsel> other packages are found, e.g. -isystem C:/msys64/mingw64/lib/gtkmm-3.0/include
17:35 <Johnsel> I think it may be shaderc
17:36 <Johnsel> do we define the dependency right in the script?
17:38 <Johnsel> in PKGBUILD I mean
17:38 <bvernoux> The dependencies related to includes and so on
17:38 <Johnsel> I think that could be it
17:38 <bvernoux> in cmake in the worst case
17:38 <bvernoux> if we need something like findGLSLC
17:38 <bvernoux> as it is not present in VulkanSDK so far
17:38 <bvernoux> to be checked
17:38 <bvernoux> it can also be added in PKGBUILD
17:39 <Johnsel> I mean, it's a Windows dependency, so better PKGBUILD
17:39 <Johnsel> but adding it does not solve it
17:39 <Johnsel> I'm not sure how this build system works
17:39 <bvernoux> to be checked if there is not the same issue on Ubuntu
17:39 <Johnsel> perhaps the package is just wrong, it seems to add -a- path to the -I but not the right path
17:40 <bvernoux> I do not know so far
17:40 <bvernoux> I'm more of a cmake/CI-build bad hacker ;)
17:40 <Johnsel> I also hate that build() modifies the code
17:40 <bvernoux> It is clearly not something where I'm fluent
17:40 <Johnsel> well, the repo
17:40 <bvernoux> I have fixed past stuff with trial and error ;)
17:41 <bvernoux> especially the CI build, which is a nightmare for nothing, as I also do not know enough about how it is intended to work
17:41 <Johnsel> sure, that's my approach too, and that's fine if you can learn as you go
17:41 <Johnsel> though I do know CI
17:41 <Johnsel> although the GitHub CI is weird
17:41 <Johnsel> though I think it may be in our favor
17:41 <Johnsel> msys2 is just a beast
17:42 <bvernoux> at least mingw64 builds real native stuff, not like the cygwin crap ;)
17:42 <bvernoux> I really hated the cygwin crap
17:42 <Johnsel> that would be worse
17:42 <Johnsel> I would not even work on that
17:43 <Johnsel> that's for other people to get headaches from
17:43 <bvernoux> Yes, me too
17:43 <bvernoux> Cygwin is a dead end for anything ;)
17:43 <bvernoux> it is often used just to rebuild Linux stuff on Windows
17:44 <bvernoux> Anyway, it is very good if you know CI stuff well
17:44 <bvernoux> for glscopeclient, which becomes more and more complex, it is a must-have
17:45 <Johnsel> I have to disagree on that, it should not have any system dependencies, the GTK stuff is just one big bugfest
17:45 <bvernoux> If you are good with CMake stuff, that is very good too
17:45 <bvernoux> as I really hate it and I'm not good at it ;)
17:45 <Johnsel> unless you mean having CI, that I agree on
17:46 <bvernoux> yes, I hate GTK too...
17:46 <Johnsel> very much related project goals
17:47 <bvernoux> I saw glscopeclient is intended to run on an RPI4?
17:47 <Johnsel> and might have some interesting finds for things you or we will run into
17:47 <bvernoux> I imagine the nightmare, as I doubt the RPI4 GPU is fast enough, and likewise the A72 is pretty slow...
17:47 <bvernoux> but here the important point is more to have a fast GPU
17:47 <Johnsel> yes, RPI4 Linux, so it builds and runs on arm64
17:48 <Johnsel> and they dev in a VM on the M1
17:48 <bvernoux> I imagine it is more just for a POC?
17:48 <Johnsel> not actual use
17:48 <bvernoux> as I doubt it will be usable
17:49 <Johnsel> that's not the point, it's just to isolate the arm64-specific issues from the arm64 + OSX-specific issues
17:49 <bvernoux> The M1 seems amazing, but I hate everything Apple does, especially how they lock users into their stuff
17:49 <bvernoux> So I will never play with it
17:49 <bvernoux> OSX is a clear no-go for me
17:50 <bvernoux> But anyway, it is interesting to have it supported to bring more users...
17:50 <bvernoux> yes, RPI4 is a very good first step to at least build on ARM64 Linux
17:51 <bvernoux> As I imagine the M1 support will be full of traps ;)
17:51 <bvernoux> The GPU is a pure black box IIRC, even worse than what we have with Nvidia or AMD
17:52 <Johnsel> also, I think we need to set up the Vulkan layers on Windows too
17:53 <Johnsel> export LD_LIBRARY_PATH=$VULKAN_SDK/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
17:53 <Johnsel> presumably that makes it work on Linux
17:54 <Johnsel> is there really no cleaner way to do this?
17:54 <Johnsel> afaik there is a script you need to run in the Vulkan install that sets up all the paths correctly
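The one-liner quoted above can be unpacked as follows; the install path is hypothetical. The `${VAR:+:$VAR}` expansion appends the old value only when the variable was already set, avoiding a stray leading colon. The Linux Vulkan SDK tarball also ships a `setup-env.sh` that exports these paths in one go, which is the cleaner route being asked about.

```shell
# Hypothetical SDK location; substitute the real install path
VULKAN_SDK=/opt/vulkansdk/x86_64

# ${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH} expands to ":<old value>" only
# when LD_LIBRARY_PATH was already set, so no leading ':' is emitted
export LD_LIBRARY_PATH="$VULKAN_SDK/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"

# Cleaner alternative: source the SDK's own environment script, which
# sets VULKAN_SDK, PATH, LD_LIBRARY_PATH and VK_LAYER_PATH together:
#   source /opt/vulkansdk/setup-env.sh
```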
18:01 <Johnsel> anyway, if it works it works, but it's not very clean lol
18:01 <Johnsel> I think it may be an issue in the FindVulkan.cmake
18:01 <bvernoux> it's me ;)
18:01 <bvernoux> yes, it is crap
18:01 <bvernoux> it is the official directive to install the Vulkan SDK
18:02 <bvernoux> I have not checked if we can do something in a cleaner way, as for the CI build it is not very important
18:02 <Johnsel> well, I'd argue it is important, because you want your CI to be the cleanest way to install, so you have a reference
18:03 <Johnsel> anyway, do we check if Vulkan_glslc_FOUND?
18:03 <Johnsel> and add that target
18:03 <Johnsel> "The glslc and glslangValidator components are provided even if not explicitly requested (for backward compatibility)."
18:03 <Johnsel> Vulkan_glslang_LIBRARY and/or Vulkan_shaderc_combined_LIBRARY
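The FindVulkan usage being discussed can be sketched like this (the component syntax requires CMake 3.24 or later, which is where the quoted backward-compatibility note comes from). The target name `myapp` is hypothetical.

```cmake
cmake_minimum_required(VERSION 3.24)
project(vulkan_probe CXX)

# Request the GLSL compiler pieces explicitly; glslc and
# glslangValidator are also provided without being requested, for
# backward compatibility with older FindVulkan versions
find_package(Vulkan REQUIRED COMPONENTS glslang glslangValidator)

if(Vulkan_glslc_FOUND)
    message(STATUS "glslc: ${Vulkan_GLSLC_EXECUTABLE}")
endif()

# Hypothetical target; link the glslang library when it was found
if(Vulkan_glslang_FOUND)
    target_link_libraries(myapp PRIVATE Vulkan::glslang)
endif()
```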
18:06 <Johnsel> also, I half joked about the "eew", it's not that bad and we can table it until we have it actually building
18:07 <Johnsel> I can be a bit brash sometimes, so I hope you know it's not meant in a negative way
18:09 <Johnsel> # For backward compatibility as `FindVulkan` in previous CMake versions allow to retrieve `glslc`
18:09 <Johnsel> # and `glslangValidator` without requesting the corresponding component.
18:13 <Johnsel> bvernoux what the actual fuck....
18:36 <benishor> Found Vulkan: /usr/lib/x86_64-linux-gnu/libvulkan.so (found version "1.3.204") missing components: glslc glslangValidator
18:37 <benishor> trying to build on Ubuntu 22.04
18:37 <benishor> how do I get those missing components?
18:56 <benishor> not sure where I need to add the include/lib dirs
18:59 <benishor> grrrrrrrrrrrrrr
18:59 <benishor> scopehal's CMakeLists.txt:
19:00 <benishor> # TODO: this needs to come from FindPackage etc
19:00 <benishor> /usr/include/glslang/Include/
19:02 <benishor> still no workie
19:02 <benishor> most likely due to refactoring?
19:03 <benishor> either that or my Vulkan stuff is too new
19:04 <Johnsel> yeah, lots of weird things going on
19:04 <Johnsel> did you see your CI code working entirely with lain's build?
19:04 <Johnsel> I really am getting a headache from this
19:05 <Johnsel> oh sorry, you are not bvernoux
19:05 <benishor> nope, still some nick starting with b
19:05 <Johnsel> I don't think right now is a good moment to build, as there are lots of things broken and in motion
19:06 <Johnsel> I'd wait a few hours, maybe days, for something reasonable
19:06 <benishor> any other way I can get a working version for Ubuntu?
19:06 <benishor> doesn't need to be the latest
19:06 <Johnsel> you could pick an older build or see if it's in a package manager
19:07 <benishor> I'll wait
19:07 <azonenberg> Yeah ok, so let me push what I have now. it's functional, but the FFTFilter block lost GPU acceleration of the actual FFT temporarily
19:07 <azonenberg> (I deleted the OpenCL and then added Vulkan for the window functions, but not for the actual FFT proper yet)
19:07 <azonenberg> plus some more unit tests and bug fixes
19:08 <azonenberg> just so we don't get people diverging too far from the latest code
19:08 <benishor> my current errors are related to the Vulkan API
19:08 <_whitenotifier-7> [scopehal] azonenberg 8c55d8d - Removed clFFT calls from FFTFilter in preparation for transitioning to Vulkan
19:08 <_whitenotifier-7> [scopehal] azonenberg 8f0cac3 - DemoOscilloscope: mark waveforms as modified CPU side
19:08 <_whitenotifier-7> [scopehal] azonenberg 9e29487 - TRCImportFilter: update for new shader configuration
19:08 <_whitenotifier-7> [scopehal] ... and 8 more commits.
19:08 <benishor> let's see if this fixes it
19:08 <_whitenotifier-7> [scopehal-apps] azonenberg fbd79e9 - Refactoring: renamed a bunch of tests. Added initial (incomplete) skeleton test case for FFT.
19:08 <_whitenotifier-7> [scopehal-apps] azonenberg 6a81aad - Added --nogpufilter argument
19:08 <_whitenotifier-7> [scopehal-apps] azonenberg 7d77a94 - Refactoring: Filters test uses common function for checking results
19:08 <_whitenotifier-7> [scopehal-apps] ... and 4 more commits.
19:08 <azonenberg> this also moves vkFFT to a submodule vs having it come from the system
19:08 <azonenberg> so one less external dependency to rely on
19:09 <azonenberg> Running out to do a quick errand, I'll be back shortly. The next step later today will be merging lain's arm64 build fixes
19:09 <azonenberg> at that point we should compile on arm64, but there will still be lots of fixes needed to actually run on the Pi 4 or M1
19:10 <benishor> still getting errors related to Vulkan
19:10 <Johnsel> which Vulkan SDK did you install?
19:11 <Johnsel> VulkanSDK-1.3.224.1 is the one that ought to work
19:57 bvernoux has quit [Ping timeout: 244 seconds]
20:02 <_whitenotifier-7> [scopehal] azonenberg 02dd692 - Merge remote-tracking branch 'origin/arm64'
20:04 <_whitenotifier-7> [scopehal-apps] azonenberg 759b7b8 - Merge remote-tracking branch 'origin/arm64'
20:05 <Johnsel> azonenberg: something interesting happened with that -fsigned-char addition
20:05 <Johnsel> it fixed the Windows CI
20:06 <Johnsel> without my fix for it, it still ran
20:06 <azonenberg> interesting. I knew we needed -fsigned-char for arm64
20:06 <azonenberg> but I was not expecting it to matter on Windows
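For context on why `-fsigned-char` matters for the arm64 port: the C and C++ standards leave the signedness of plain `char` implementation-defined. x86 toolchains default to signed; arm64 (and some others) default to unsigned, so code that stores negative values in `char` silently changes behavior between the two. `-fsigned-char` forces the signed convention everywhere. A small illustration (function name is ours, not scopehal's):

```cpp
#include <limits>

// Returns -1 if plain char is signed (x86 default, or any target built
// with -fsigned-char); 255 if unsigned (arm64 default)
int PlainCharValue()
{
    char c = (char)0xFF;
    return (int)c;
}
```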
20:06 <Johnsel> is that weird or is that weird
20:07 <Johnsel> the weirdest part is that it is not always necessary, because it can be worked around with my fixes to pin it to a certain shell
20:07 <Johnsel> or at least it seems that that is also a fix; given that the rest of the code is broken, I haven't seen it applied to bvernoux's changes and work
20:07 <Johnsel> but there is definitely something fucky going on
20:08 <Johnsel> because I don't understand why this became a problem when it did
20:10 <Johnsel> I will be returning to my own projects for a bit again, because I keep telling myself to do that but end up here anyway
20:10 <Johnsel> I'll be back in a couple of days though, feel free to ping me if there is a need
22:59 Johnsel has quit [Ping timeout: 268 seconds]
22:59 <azonenberg> oh boy
23:04 Johnsel has joined #scopehal
23:15 <azonenberg> welp, we're now importing vkFFT as a submodule from a local fork
23:16 <azonenberg> because upstream has a thread safety bug that causes use-after-frees
23:18 Johnsel has quit [Ping timeout: 252 seconds]
23:46 <azonenberg> ok, I got vkFFT working, still have to GPU the normalization of the output
23:46 <azonenberg> but that should be quick