I have a Samsung S4 Mini (i9195) running close to mainline 6.19. Good. It took more debugging than it should have.
I also have patches that let me trace execution from assembly through init/main.c to page-table setup in C, first using pixels on the framebuffer and then a hacked console.
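For illustration, here is a minimal sketch of the pixels-on-framebuffer trick, not my actual patches: the framebuffer base address, width and pixel format are placeholders (on real hardware they come from lk2nd / the device tree), and before the MMU is up you would use the physical address directly.

#include <stdint.h>

/* Placeholder framebuffer parameters -- real values come from the bootloader. */
#define FB_BASE   ((volatile uint32_t *)0x8E000000u)
#define FB_WIDTH  720

/* Paint one 16-pixel-tall stripe per call; the last stripe visible on screen
   tells you how far boot got before things went wrong. */
static void debug_mark(int step)
{
    volatile uint32_t *fb = FB_BASE + (uint32_t)step * 16u * FB_WIDTH;

    for (int y = 0; y < 16; y++)
        for (int x = 0; x < FB_WIDTH; x++)
            fb[y * FB_WIDTH + x] = 0x00FF00FFu;  /* easy-to-spot magenta */
}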
Pavel Machek
Self-built lk2nd simply works on S4 mini. Wow. That was easy :-).
Libcamera "simple" example usage has three files, 600 lines total. Good news is that it works, bad news is that it stops working almost as soon as I modify it... and not in a way that is easy to debug (segfault, debug trap, backtrace pointing to source line with "}"). I integrated it with SDL, and that was easy, got buffers mapped and translated to format that can be displayed. But when I try to close front camera and open back one, or to change resolution, bad things (tm) happen. Fact that empty build takes 15 seconds also does not help.
My code is in the tui/libcamera repository on GitLab (commit 456a19f6, "mcam: Cleanups, still crashes when pressing 'Snap'"). Press the "Snap" button to demonstrate the problem.
If you can help debug it, can get it to work, or know a better example to start from, please let me know. I would not mind boosts. You may get a libcamera-based application to take photos / record videos on phones in the future if we can get this to work.
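For what it's worth, here is a minimal sketch, not my working code, of the stop/teardown order I believe libcamera expects when switching cameras; if requests are still in flight or buffers are still allocated when the old camera is released, crashes like the above seem plausible. The function and variable names are mine, and error handling is omitted.

// Hypothetical sketch: switching from one libcamera camera to another.
// Assumes 'cm' is a started CameraManager and 'camera' is the currently
// acquired, started camera.
#include <libcamera/libcamera.h>
#include <memory>
#include <string>
#include <vector>

using namespace libcamera;

void switch_camera(CameraManager &cm,
                   std::shared_ptr<Camera> &camera,
                   std::unique_ptr<FrameBufferAllocator> &allocator,
                   std::vector<std::unique_ptr<Request>> &requests,
                   Stream *&stream,
                   const std::string &new_id)
{
    /* 1. Stop streaming: no request may stay queued past this point. */
    camera->stop();

    /* 2. Drop requests and free buffers *before* releasing the camera. */
    requests.clear();
    allocator->free(stream);
    allocator.reset();

    /* 3. Release the old camera, then acquire and configure the new one. */
    camera->release();
    camera = cm.get(new_id);
    camera->acquire();

    std::unique_ptr<CameraConfiguration> config =
        camera->generateConfiguration({ StreamRole::Viewfinder });
    config->validate();
    camera->configure(config.get());

    /* 4. Re-allocate buffers and requests for the new stream. */
    stream = config->at(0).stream();
    allocator = std::make_unique<FrameBufferAllocator>(camera);
    allocator->allocate(stream);
    for (const std::unique_ptr<FrameBuffer> &buffer : allocator->buffers(stream)) {
        std::unique_ptr<Request> request = camera->createRequest();
        request->addBuffer(stream, buffer.get());
        requests.push_back(std::move(request));
    }

    /* 5. Start streaming again and queue the fresh requests. */
    camera->start();
    for (auto &request : requests)
        camera->queueRequest(request.get());
}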

Front camera on OnePlus 6 now "works" with libmegapixels. Good. More work ahead.
I'm experimenting with sensors on #librem5 (but it should work on other #linuxphones after tweaks). So far I have a compass and a step counter. github.com:pavelmachek/Espruino.git
#phonecamera #linuxphone #librem5
The Librem 5 camera/kernel can do three resolutions: ~1024x768 @ ~24 fps, ~2048x.. @ ~31 fps and ~4096x.. @ ~15 fps. Debayering is actually easier and gives better quality if we downscale at the same time, and that allows the best framerate, so we do that (the 2048x.. resolution).
ARM has problems with cache coherency w.r.t. DMA, and the kernel's solution is to simply disable caching on dmabufs for userspace, which means accessing video data on the CPU is 10x slower than it should be. That makes debayering on the GPU attractive, and that's what we do (gold.frag). The GPU can easily do more image signal processing, too, so we do some of that.
Unfortunately, we hit the same uncached-memory problem at the GPU output, so we use a separate thread to copy. All of this does not fit on one core, so we need two threads: one controlling the GPU debayer on frame n+1 while the other copies video data from frame n (heart.c). We save the resulting RGBA data to a ramdisk. This all costs maybe 80% of one core.
From there, Python scripts can pick the frames up: ucam.py displays the viewfinder and mpegize.py handles the video encoding via gstreamer. There's basically 0% CPU left, but I can encode ~1024x.. video. Unfortunately that's without audio and with the viewfinder at 1 fps. Plus, the combination of C + Python is great for prototyping, but may not be that great for performance.
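To make the two-thread overlap above concrete, here is a rough, self-contained sketch of the ping-pong arrangement; the GPU debayer and the uncached copy are stubbed out, and the names are mine rather than the ones in heart.c.

// Hypothetical sketch: main thread drives the GPU debayer of frame n+1
// while a worker thread copies the (slow, uncached) GPU output of frame n.
#include <condition_variable>
#include <mutex>
#include <thread>
#include <vector>

struct Slot {
    std::vector<unsigned char> gpu_output;  /* stand-in for the uncached GPU buffer */
    std::vector<unsigned char> copy;        /* cached copy handed on for encoding */
    bool ready = false;                     /* GPU output ready to be copied */
    bool done = true;                       /* copy finished, slot reusable */
};

std::mutex m;
std::condition_variable cv;
Slot slots[2];
bool quit = false;

/* Placeholder for the real GPU debayer dispatch. */
void gpu_debayer(int frame, Slot &s) { s.gpu_output.assign(1024 * 768 * 4, frame & 0xff); }

void copier()
{
    for (int frame = 0; ; frame++) {
        Slot &s = slots[frame % 2];
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return s.ready || quit; });
        if (quit)
            return;
        lock.unlock();

        s.copy = s.gpu_output;              /* the expensive uncached read */

        lock.lock();
        s.ready = false;
        s.done = true;
        cv.notify_all();
    }
}

int main()
{
    std::thread worker(copier);

    for (int frame = 0; frame < 100; frame++) {
        Slot &s = slots[frame % 2];

        /* Wait until the copier has drained this slot from two frames ago. */
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return s.done; });
        lock.unlock();

        gpu_debayer(frame, s);              /* overlaps with the copy of frame n-1 */

        lock.lock();
        s.done = false;
        s.ready = true;
        cv.notify_all();
    }

    { std::lock_guard<std::mutex> lock(m); quit = true; }
    cv.notify_all();
    worker.join();
    return 0;
}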
My code is in the icam directory of the tui/Tui repository on GitLab.
At this point I'd like the viewfinder functionality merged into the rest of the GPU processing. Ideally, I'd like to have a bitmap with GUI elements, combine it with the scaled RGBA data, and render it to the screen. I know SDL and Gtk; SDL looked like the better match, but I could not get SDL and GPU debayering to work in a single process (template SDL code is in sdl/main.c of the tui/debayer-gpu repository on GitLab).
If you can integrate main.c and heart.c, that would be welcome. If you have example code that combines SDL with processing on the GPU, that would be nice, too. If you know someone who can do GPU/SDL, a boost would not be bad, I guess.
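To show what I mean by combining a GUI bitmap with the RGBA frames, here is a rough sketch using plain SDL2 streaming textures; it does not solve the SDL-plus-GPU-debayer-in-one-process problem, it only shows the compositing step, and the sizes and pixel contents are made up.

// Hypothetical sketch: display an RGBA camera frame with a GUI overlay on top,
// using SDL2's renderer and streaming textures.
#include <SDL2/SDL.h>
#include <vector>

int main()
{
    const int W = 1024, H = 768;

    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("viewfinder", SDL_WINDOWPOS_UNDEFINED,
                                       SDL_WINDOWPOS_UNDEFINED, W, H, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    /* Camera frames go into a streaming texture, the GUI into a static one. */
    SDL_Texture *frame = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ABGR8888,
                                           SDL_TEXTUREACCESS_STREAMING, W, H);
    SDL_Texture *gui = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ABGR8888,
                                         SDL_TEXTUREACCESS_STATIC, W, H);
    SDL_SetTextureBlendMode(gui, SDL_BLENDMODE_BLEND);

    std::vector<Uint32> rgba(W * H, 0xff808080);   /* fake camera frame */
    std::vector<Uint32> overlay(W * H, 0);         /* transparent GUI layer */
    for (int x = 0; x < W; x++)
        overlay[(H - 40) * W + x] = 0xc0ffffff;    /* a "button bar" line */
    SDL_UpdateTexture(gui, nullptr, overlay.data(), W * sizeof(Uint32));

    bool running = true;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT)
                running = false;

        /* In the real thing, rgba would come from the debayer output / ramdisk. */
        SDL_UpdateTexture(frame, nullptr, rgba.data(), W * sizeof(Uint32));

        SDL_RenderClear(ren);
        SDL_RenderCopy(ren, frame, nullptr, nullptr);  /* scaled camera image */
        SDL_RenderCopy(ren, gui, nullptr, nullptr);    /* GUI on top */
        SDL_RenderPresent(ren);
    }

    SDL_DestroyTexture(gui);
    SDL_DestroyTexture(frame);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}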
@Doylov Vasiliy @Martijn Braam

edit: Thanks to all the heroes who chimed in. In the meantime, I got help from an entity that shall not be named, and currently have something fast enough. And I have great human experts on the line, with patches to test. Thanks again! :-)
Can you program GPUs and do you want to become a HERO? The #linuxphone community needs your help.
We are trying to record video, and have most pieces working, but one is missing: fast enough debayering. That means about 23 MB/sec on #librem5.
Debayering is not hard; camera images have subpixels split across two lines, which need to be corrected. They also use a different color representation, but that is fixable with a table lookup and two matrix multiplies.
The Librem 5 has a Vivante GPU, 4 in-order CPU cores and 3 GB of RAM. My feeling is that it should be fast enough for this. If the task is for some reason impossible, that would be good to know, too.
Image data looks like this
RGRGRG...
xBxBxB...
.........
.........
The task is to turn that into the usual rgbrgb... format: rgb = RGB * color matrix, with table lookups for better quality. I can fix that part once I get an example.
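As an illustration of the math only, here is a plain-CPU reference sketch, far too slow to be the answer and not GPU code; it assumes an RGGB layout with 8-bit samples, folds in the 2x downscale mentioned earlier, and uses a made-up color matrix.

// Hypothetical CPU reference for the debayer + color-matrix step.
// A GPU version would do the same per-pixel work in a fragment shader.
#include <algorithm>
#include <cstdint>

/* Made-up color matrix; the real one comes from sensor calibration. */
static const float M[3][3] = {
    {  1.6f, -0.4f, -0.2f },
    { -0.3f,  1.5f, -0.2f },
    { -0.1f, -0.5f,  1.6f },
};

static inline uint8_t clamp8(float v) { return (uint8_t)std::clamp(v, 0.0f, 255.0f); }

/* raw: w*h Bayer samples (RGGB); rgb: (w/2)*(h/2)*3 output, downscaled 2x. */
void debayer_rggb(const uint8_t *raw, int w, int h, uint8_t *rgb)
{
    for (int y = 0; y < h; y += 2) {
        for (int x = 0; x < w; x += 2) {
            /* One 2x2 cell: R at (0,0), G at (0,1) and (1,0), B at (1,1). */
            float R = raw[y * w + x];
            float G = 0.5f * (raw[y * w + x + 1] + raw[(y + 1) * w + x]);
            float B = raw[(y + 1) * w + x + 1];

            /* rgb = RGB * color matrix; gamma / table lookup would follow. */
            uint8_t *out = rgb + ((y / 2) * (w / 2) + x / 2) * 3;
            out[0] = clamp8(M[0][0] * R + M[0][1] * G + M[0][2] * B);
            out[1] = clamp8(M[1][0] * R + M[1][1] * G + M[1][2] * B);
            out[2] = clamp8(M[2][0] * R + M[2][1] * G + M[2][2] * B);
        }
    }
}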
I'm looking for example code (#pinephone would work, too), or for reasons it cannot be done... and for boosts if you have friends who can program GPUs. #gpu #opensource
So everyone is taking pictures with #oneplus6 #mobilelinux #postmarketos. Can you get me some metadata? Which sensor works, at what resolution, and what format does it use? (BGGR8? BGGR10? Something better?) I am using a Librem 5, but that only has 8-bit Bayer working (and many other problems).
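If someone wants to collect that metadata, a small sketch along these lines should help on a plain V4L2 node; the device path is an assumption, and on phones where the sensor sits behind the media controller this simple case may not be enough.

// Hypothetical helper: enumerate pixel formats and frame sizes on a V4L2 node.
#include <cstdio>
#include <cstring>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(int argc, char **argv)
{
    const char *dev = argc > 1 ? argv[1] : "/dev/video0";   /* assumed path */
    int fd = open(dev, O_RDWR);
    if (fd < 0) { perror(dev); return 1; }

    struct v4l2_fmtdesc fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

    for (fmt.index = 0; ioctl(fd, VIDIOC_ENUM_FMT, &fmt) == 0; fmt.index++) {
        printf("format %c%c%c%c (%s)\n",
               fmt.pixelformat & 0xff, (fmt.pixelformat >> 8) & 0xff,
               (fmt.pixelformat >> 16) & 0xff, (fmt.pixelformat >> 24) & 0xff,
               (const char *)fmt.description);

        struct v4l2_frmsizeenum size;
        memset(&size, 0, sizeof(size));
        size.pixel_format = fmt.pixelformat;
        for (size.index = 0; ioctl(fd, VIDIOC_ENUM_FRAMESIZES, &size) == 0; size.index++) {
            if (size.type == V4L2_FRMSIZE_TYPE_DISCRETE)
                printf("  %ux%u\n", size.discrete.width, size.discrete.height);
        }
    }

    close(fd);
    return 0;
}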