KaiOS screen mirroring new method
According to the commits in the kaiscr repository on NotABug, about 9 months after my first commit there I started seriously researching a better, higher-performance way to mirror the screen of a KaiOS device. The current
kailive.py script works by repeatedly taking screenshots of the screen using
kaiscr.py and can reach 10-12 FPS. However, most of the time the FPS rate stays under 10, and the process puts heavy load on the device, making many demanding apps slow if you want to run an app and mirror the screen at the same time.
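For context, the repeated-screenshot approach can be sketched roughly like this (a minimal sketch; `take_screenshot` is a hypothetical stand-in for what kaiscr.py actually does over the debugger socket):

```python
import time

def take_screenshot():
    # Hypothetical stand-in for kaiscr.py: on a real device this would
    # return one PNG-encoded frame fetched over the debugger socket.
    return b"\x89PNG..."  # dummy payload

def mirror(frames=50):
    """Grab frames in a loop and report the achieved FPS."""
    start = time.monotonic()
    for _ in range(frames):
        frame = take_screenshot()
        # ...decode and display the frame here...
    elapsed = time.monotonic() - start
    return frames / elapsed if elapsed > 0 else float("inf")

if __name__ == "__main__":
    print(f"{mirror():.1f} FPS")
```

With a real device on the other end, each `take_screenshot()` call carries the full cost of capturing, encoding, and transferring a frame, which is what caps the loop's FPS.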
My guess was that the problem is encoding the raw image to PNG on the device for each frame, since the screenshot file type is PNG. According to what fabrice said in #b2g:mozilla.org (a Matrix room), it should also be possible to get JPG instead of PNG, but that doesn't solve the problem.
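To see why per-frame encoding could matter: PNG compresses every frame with DEFLATE, which is CPU-heavy for a phone. A rough way to feel that cost from the desktop side, using zlib (the same compression family PNG uses; the 240x320 resolution is an assumption based on the 8110's screen):

```python
import time
import zlib

WIDTH, HEIGHT = 240, 320               # assumed screen size of the 8110
raw_frame = bytes(WIDTH * HEIGHT * 3)  # one raw RGB frame (all black here)

start = time.monotonic()
compressed = zlib.compress(raw_frame, 6)  # effort comparable to PNG default
elapsed = time.monotonic() - start

print(f"raw: {len(raw_frame)} B, compressed: {len(compressed)} B, "
      f"encode time: {elapsed * 1000:.2f} ms")
```

On a laptop this is fast, but the device has to pay a similar (much larger, on its weak CPU) cost for every single frame, on top of drawing the frame in the first place.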
If I was correct, I should be able to achieve good performance by somehow taking raw screenshots. My early research, shortly after I had finished
kailive.py, focused on finding a low-level way, using Android libraries or even the Linux kernel, to read raw pixels from the GPU or from some buffer somewhere. But I had no experience at that level, so I failed to find a way. I also tried reading from
/dev/graphics/fb0, the framebuffer, using fbcat. But the result was a blank (all-black) image with a resolution equal to the screen's. And I had other evidence leading me to conclude that the framebuffer is not used for displaying things on these devices: the framebuffer is slow, yet my banana phone is capable of running high-FPS games (even 30 counts as high FPS, as far as I know).
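For reference, decoding a raw framebuffer dump would look something like this (a sketch, assuming a little-endian RGB565 pixel format, which is common on low-end devices; in my case the buffer only ever contained black, so the format hardly mattered):

```python
def rgb565_to_rgb888(buf, width, height):
    """Decode little-endian RGB565 framebuffer bytes into (r, g, b) tuples."""
    pixels = []
    for i in range(0, width * height * 2, 2):
        v = buf[i] | (buf[i + 1] << 8)
        r = (v >> 11) & 0x1F
        g = (v >> 5) & 0x3F
        b = v & 0x1F
        # Scale the 5/6/5-bit channels up to 8 bits each.
        pixels.append((r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2))
    return pixels

# On a rooted device you would dump the raw bytes first, e.g.:
#   cat /dev/graphics/fb0 > fb.raw
# and then decode the dump on the laptop:
#   pixels = rgb565_to_rgb888(open("fb.raw", "rb").read(), 240, 320)
```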
I left this alone for a long time without any further research, until a week (or two) ago. This time my approach was to research a higher level of the system: Gecko. Fortunately, after "grepping" and tracing the Gecko source code for the 8110 (available on the Nokia website at the time of writing this post), I found out how the System app takes screenshots and outputs them as PNG.
It draws the whole window on a canvas, saves the content of that canvas as PNG, and sends it through the debugger socket. Soon after that I found a way to export raw RGB data rather than PNG and send it to a TCP server listening on my laptop. But did I succeed? The answer is no!
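The laptop side of that experiment can be sketched as a plain TCP server reading fixed-size raw RGB frames (a sketch, not my exact script: the frame size assumes a 240x320 screen, the port number is arbitrary, and the device side would be Gecko-side JavaScript pushing the canvas's raw pixel data into the socket):

```python
import socket

WIDTH, HEIGHT = 240, 320
FRAME_SIZE = WIDTH * HEIGHT * 3  # raw RGB, 3 bytes per pixel

def recv_exact(conn, n):
    """Read exactly n bytes from the socket, or raise on disconnect."""
    chunks = []
    while n:
        chunk = conn.recv(n)
        if not chunk:
            raise ConnectionError("device disconnected mid-frame")
        chunks.append(chunk)
        n -= len(chunk)
    return b"".join(chunks)

def serve(host="0.0.0.0", port=6969):
    # Arbitrary port; use whatever the device-side script connects to.
    srv = socket.create_server((host, port))
    conn, addr = srv.accept()
    print("device connected:", addr)
    while True:
        frame = recv_exact(conn, FRAME_SIZE)
        # ...hand the raw RGB frame to a viewer/encoder here...
```

Since TCP is a byte stream, `recv_exact` is needed to reassemble each frame; a single `recv()` is not guaranteed to return a whole frame at once.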
First I tried 30 FPS, but the device couldn't keep up and I didn't get the FPS rate I wanted. The same goes for 20 FPS...
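The numbers explain part of it: even with encoding removed, raw RGB frames are a lot of data for such a device to produce, copy, and push over a socket (again assuming a 240x320 screen):

```python
WIDTH, HEIGHT, BPP = 240, 320, 3   # assumed 8110-class screen, raw RGB

frame_bytes = WIDTH * HEIGHT * BPP
for fps in (30, 20, 10):
    rate = frame_bytes * fps
    print(f"{fps:>2} FPS -> {rate / 1_000_000:.1f} MB/s "
          f"({frame_bytes} B/frame)")
```

At 30 FPS that is roughly 6.9 MB/s of pixel data, so the per-frame cost clearly isn't just the PNG encoder.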
This means that my guess was simply wrong. The only thing I have not tried is
captureStream(), but I have little hope it will work as expected.
I have written a short report with a code snippet on the wiki, which you can read here: https://wiki.bananahackers.net/en/development/screen-mirroring#taking-screenshots-without-encoding-and-decoding-to-or-from-png