Lair Of The Multimedia Guru

2007-10-06

F50fd part2

The image stabilization: well, I've done more testing, 10 images without IS, 10 in continuous mode, 10 in shot-only mode, all at the wide-angle end at 1/4 sec and 12 MP. Quick summary: the IS does not help. Find the 2 best-looking cropped images for each of the 3 cases in the table below:

IS: off
IS: continuous
IS: shot only

Random low-ISO images








And in case anyone is wondering how I reached an image number above 6000 in a few days of testing: well, that's what happens if you switch SD cards around randomly. One thing I learned from that, though, was that the F50fd had no problems displaying images made with the IXUS, though it did not play my naively made MJPEG in AVI :( It did not even play its own video after it was remuxed, but I didn't investigate this any further …

Filed under: Cameras,Pictures — Michael @ 22:32

Fujifilm F50fd vs. Canon digital ixus 30

I wanted a small compact digicam which is better at taking pictures in bad light than my ixus30, has full manual controls, a wider-angle lens, … well, no such camera exists currently. So I decided to try the F50fd: it has aperture and shutter priority modes, can take images at up to ISO 6400 and has an image stabilizer.

Size and weight: Well, it's a bit bigger and heavier than my ixus30, but it is still small enough to fit in my pocket.

The image stabilizer: Well, what can I say. I took many images at the tele end at 1/5 sec shutter with it in continuous mode, with it in shot-only mode and with it disabled; not a single image was usable. Repeating the test at the wide-angle end of the zoom at 1/5 sec, some images were OK, others were not, and the stabilizer again didn't make much of a difference. Sadly I lost patience and didn't take more than about 5 images each with the stabilizer in mode 1, in mode 2 and without it, so I don't have statistically significant data for that part. What I can say, though, is that the stabilizer did not help me take a single usable image at a shutter speed at which I wouldn't be able to take a good one with a few tries without it.

The manual control: Well, it does work, and even quite well. You can change the aperture with a single button press, and the exposure correction can likewise be changed with a single button press (after you have switched the buttons into the right mode, which needs 1 button press). Very sadly, to change the ISO you have to press at least 3 buttons (4 after the camera is turned on). On the ixus30 you need 1 button press to change ISO and exposure correction if you are in the correct menu (yes, you can take images without leaving the menu on the ixus; this doesn't work with the F50 ISO menu), 2 button presses to change between the ISO and exposure menus, and 2 to reach the exposure menu and 4 for the ISO menu after power-on. So in summary, the manual control could be made available more directly on both cameras.

Arbitrary limitations: There were a few surprises for me as I was playing around with the camera. First, the aperture priority mode is limited to a shutter of 1/4 sec at the long end, while the shutter priority mode can be used with up to 1 sec. Longer exposures (up to 8 sec) are possible, but only at ISO 100.

Deleting images: I was mildly annoyed that I had to press 3 buttons to delete an image, but what was much more annoying was that the deletion comes with a nice animation which you can't disable and which you have to let finish before you can do anything else. That is, it's not hard to press the 3-button sequence a few times while the animation plays; sadly it has no effect.

Auto focus: This one does work better than on the ixus in low light with the focus-assist lamp disabled.

High ISO image quality: Well, I hoped that the F50 would be a lot better at taking images at high ISO; sadly the difference is not that large, but see for yourself:
F50 ISO 3200 1/4 sec f/2.8. Now how does one compare this to the IXUS30, which just has ISO 400? Well, one (mis)uses the exposure correction to get the exposure one wants and then fixes the brightness/contrast in software.
IXUS30 ISO 400 -2EV 1/5 sec f/2.8 (way too dark, yes)
After -vf ow=7:8:16 and gimp to fix the levels. So which looks better? The F50 one, of course. Let's compare it to a longer exposure of the IXUS:
ISO 400 -1EV 1/2 sec f/2.8
Fixed up levels in gimp
And with -vf ow=7:8:16. With the 2x longer exposure I would say the IXUS is at least as good as the F50 (with the shorter exposure). And there would be thumbnail images if wordpress would generate them, or if I knew how to upload several images at once instead of each one individually …

ISO values: Well, the more I played with the F50, the more I noticed that its ISO values don't match the IXUS ones; that is, the IXUS30 at ISO 400 seems to match approximately ISO 600 on the F50 (matching here means same shutter, same aperture and equally bright final images).

Colors: Well, the F50 is about as (in)accurate as the IXUS30, though they are different.

Will I keep the F50? Probably not; the improvement over the IXUS30 is too small. Also, I don't like the heavy noise reduction the camera does: it becomes noticeable already at ISO 400. The IXUS30 doesn't do this, though more recent IXUS versions seem to also follow the trend of butchering images with noise reduction.

I'll upload more pictures tomorrow …

Filed under: Cameras,Pictures — Michael @ 03:44

2007-09-30

Flowers and bees

Pictures from my balcony, to fulfill Mike's request of wasting more bandwidth ;)



I bought one of these a few years ago and now my balcony is full of them … so I, for one, welcome our new plant overlords, and I am sorry that the ixus overexposed your mighty petals ;)

If only those bees would wait until I got out of the sun to be able to see anything on the display of the ixus30, then navigated through 2 menus to set the EV and went back …

This one would have been nice if it were focused properly.

Properly focused this time.

Fluffy ball ;) is moving too fast for the 1/200 shutter. Why did the ixus choose f/5.6 here instead of f/2.8 with a shorter exposure? And why doesn't it support aperture priority :(


Filed under: Nature,Pictures — Michael @ 02:12

Recent changes in ffmpeg

Maybe you wonder what is currently happening in ffmpeg development. No, nothing special; I just thought I'd summarize what I remember, and maybe I or someone else could/should write some weekly what's-new report from now on? :)

  • heaps of FLAC encoder optimizations by Loren
  • an H.264 PAFF patch, so don't give up hope yet, maybe we will soon have that in svn
  • AMV audio and video support
  • IPv6-related fixes by Ronald
  • some minor H.264 optimizations by Andreas
  • some experimental H.264 multithreading code by Andreas which splits entropy coding into its own thread, though don't expect that in svn in the near future
  • an MMS patch, ohh darn, I just realized I forgot about that one :(
  • various improvements and bugfixes in the RT(S)P code by the Luca brothers ;) (sorry, couldn't resist)
Filed under: FFmpeg — Michael @ 00:48

2007-09-28

Potato goulash

To continue the late series of totally off-topic postings, and to waste some bandwidth with big images, I've recorded what I've cooked today (no, still with my canon ixus 30, as I've got no new cam yet …).

First, an onion and the mandatory safety goggles one needs to work with onions:

After peeling and washing the thing:

After chopping it up into cubes and realizing that it's a little more than I expected (I wish I had an image stabilizer, or would have checked the image, or had taken more, as both I took turned out blurred):

A chopped-up, halved and fried onion:

1 kg of potatoes, well almost:

I am hungry; better too much than too little :)

Peeled, washed and chopped-up potatoes in the cauldron with the onions …

Added water and put it on the fire. And now, as I see it, cursing that I thought the ixus 30 could take images at ISO 100 a meter under a 100 W light bulb. Besides that, why is the damn thing not displaying the shutter speed? If I had known it was 1/8 sec I would have done something about it …

Sausage (Extrawurst in this case). Lord ixus, though, focused on the wrong end, and yes, I did try to focus on the other side; don't ask why I didn't check half of the images right after taking them …

Sausage cut up into small cubes and added to the pot; again 1/8 sec shutter and all blurry. And normally one should fry the sausage, unless one is afraid of nitrosamines …

Adding hot paprika made in Hungary, marjoram and a soup cube. Lord ixus now chose 1/6 sec, and even though I took 4 images of this they are all blurred, argh. Next I'll cook the ixus, or maybe I should cook myself for being too stupid to realize that ISO 100 won't work for this.

Mixed …

Closed, and the image is quite OK even though it's 1/6 sec handheld.

15min later …

Finished :)

Filed under: Cooking,Pictures — Michael @ 01:26

2007-09-26

Spam Karma 2

As you certainly have (not) noticed, I've enabled Spam Karma 2. Mike Melanson worked hard on installing a few anti-spam plugins for me, Spam Karma 2 being one of them. Thanks Mike!

Until now, spam was stopped by a simple word list, and anything containing a link was blocked as well. On my side, procmail sorted the large amount of moderated comments into certain spams and uncertain as well as non-spam comments; I then, from time to time, deleted spams which were missed and approved what was blocked incorrectly. That worked quite well actually … but you know, if it ain't broken, then break it ;)

Spam Karma 2 should hopefully catch all spams and not block legitimate comments ;) It even has the ability to display a captcha if it's not certain. Just an hour ago it did that for a spammer who tried to sell some “herbs” for long “legs”. It didn't help though; I had to delete that by hand after the spammer or his script passed the captcha check.

If you write a comment and it gets blocked with no captcha or other tests, then please send me an email!

Comments welcome!

Filed under: Uncategorized — Michael @ 21:10

2007-09-23

Which digicam should I buy?

A few days ago I came up with the idea of replacing my several-years-old canon digital ixus 30 with something newer and better. Maybe with a more recent canon ixus (called Elph SD+some number outside Europe); here's a summary of what I've found has changed:

Noise at same ISO? Well, I am not sure if it improved a little or not; the test images in various digicam tests on the web aren't easy to compare due to different resolutions and likely different noise reduction used by the cameras …
ISOs: The recent ixus/elphs support 800 and 1600, which is nice though somewhat noisy. OTOH they lost ISO 50 support, which is a pity, and I can't help but wonder why no one has hacked the damn crap to support any ISO. Or has someone?
Manual controls: No, still not; we need to wait another 10 years for canon to remove that #ifndef ELPH in their source code.
Wide angle: 28-105mm on the SD800 and SD870 instead of 35-105mm, which is nice, but I'd be happier if they had 18-55mm or something in that area.
Max aperture: No, still at f/2.8, and that's the same for all compact cameras I found.
Weight: Seems to have gotten worse …
Really important things: Face detection, yes, really, everyone dreamed of it and wanted it, and now finally we have it in every new camera ;) But seriously, wtf. Is the industry totally insane? What is this nonsense good for … if at least it would recognize people and, depending on who it is, focus on something else :) then it might be useful, but …
Just remember: the user of the camera knows what he wants to focus on; the camera does not and cannot. The user might even want to focus on different things in the same situation depending on his mood; it's not something a camera or another human can guess.

So it seems like 3 years isn't enough for canon to improve their (ultra) compact cameras. What about the powershot A series? Well, I had the A95 for a week, but it was simply way too heavy and big, and its light sensitivity was not any better than that of the much smaller canon ixus, IIRC. Though at least it had manual controls, so the camera did what I wanted. And the fuji F30/F31fd? Well, fuji stopped producing them, and their replacement, the F50fd, produces vertical stripes with a nice green-magenta gradient over its images, making high ISO unusable without some cleanup. (Noise is one thing, but stripes, even if they are fainter than the noise, just aren't something I want in my images.) Also, the F50fd is noisier than the F31fd, though still significantly less noisy than anything produced by its competitors in that size and weight range that I found.

So, suggestions welcome, but it looks like I won't buy any new camera. Capitalism has once again demonstrated that it doesn't work: it's not as if the consumer knew which product is best and would thus force the industry to produce better products; it's rather that the industry prints irrelevant numbers (megapixels) on the cameras and the consumer buys based on that, ignoring all relevant parameters. So the industry just needs to exchange the CCD for one with more megapixels and paint the camera differently, instead of improving anything relevant.

Filed under: Off Topic — Michael @ 03:35

2007-09-02

Google's Summer of Code 2007

The company which tries so hard not to be evil has this year again paid students, 900 according to wikipedia, to work on free software. 8 of them worked on ffmpeg this year. Luckily the results are less disastrous than last year, when half of the 5 students who were supposed to work on ffmpeg plain disappeared, and just reappeared before the deadline to convince us that they were almost finished with everything so as to get paid …

But let's rather talk about this year's SOC. First, there were 19 students submitting 37 applications about ffmpeg. To rate these 37 applications and to prevent the high failure rate of last year, we required students to do a qualification task, that is, they had to make a not-too-insignificant contribution to ffmpeg to be accepted. What exactly they did was pretty much their decision, though there was a list of suggestions. As a beautiful side effect, the qualification tasks led to some nice new features for ffmpeg :)

9 of the 19 students submitted a qualification task, and all those who submitted one passed; in addition, 1 student was qualified through extensive past work on ffmpeg. From these 10 students, 2 sadly couldn't be accepted as they wanted to work on the same tasks as other students; well, in principle it would have been possible to let 2 students work on the same task, but it seemed silly. 1 wasn't accepted as his project appeared rather uninteresting and somewhat unrelated to video/audio. The 8th slot google provided was thus given to a student who didn't submit a qualification task. Also, the actual decision of who would be accepted and who not was that of the mentors rating applications, not of any single person …

So whats the current status?

  • David's matroska muxer has passed review and should be committed to ffmpeg svn soon
  • Kostya's RV40 decoder looks pretty good and could probably be committed to ffmpeg svn soon; well, actually, if Kostya wanted he could commit immediately and continue work in ffmpeg svn
  • Marco's dirac decoder is also in good shape and there's not much keeping it from being committed; the encoder, though, needs more work
  • Kamil's jpeg 2000 encoder and decoder aren't in good shape yet (only 2 out of 50 encoded images can be decoded by jasper, and only 2 of 23 reference jpeg2k files can be decoded by Kamil's decoder), but then please don't forget that writing an encoder and a decoder at the same time is harder than just one of the 2
  • Reynaldo's QCELP decoder is missing a working postfilter and 1/4 and 1/8 rate decoding, so it needs more work as well
  • Bobby's avfilter is in quite good shape, though it's missing well-working colorspace negotiation; it's also missing actual integration in ffmpeg.c, only ffplay.c is there
  • Bartlomiej's EAC-3 decoder hasn't been reviewed by me yet …
  • Xiaohui's mpeg-ts muxer itself also hasn't been reviewed by me yet; his patches, though, which were submitted and which I've reviewed, need more work

I hope, of course, that all 8 students continue to work on their projects, free software in general and ffmpeg specifically!

Filed under: Uncategorized — Michael @ 22:55

2007-07-10

Forgotten code, noe and mina

libnoe

libnoe is a library to encode and decode Reed-Solomon codes, which I wrote between 2002 and 2006.

noe

noe is an application which uses libnoe to generate an error correction file for some data file(s) and then use that file to correct a wide variety of possible errors, including the data being randomly chopped up and reordered. “noe” btw stands for “no error”, in case you are wondering. Sadly I've never finished the noe application.

The basic idea of how noe would work is the following. First, the data itself is unchanged; changing it would be inconvenient in many situations. The error correction file is made of many not-too-large packets; this ensures that any reordering which happens to the error correction file can be corrected by simply searching for the packet headers and looking at a sequence number in the header. The error correction packets would also contain some fingerprints of the data in the data file(s); that is, for example, every 100th or 1000th bit of the data file would be stored in some error correction packet in the error correction file. With these fingerprints it's possible to detect and correct reorderings which might have happened to the data file, even if just a random subset of the error correction packets is intact. The fingerprints as well as the headers of the error correction packets would contain some small checksums, to avoid confusing the code with many wrong values. Lastly, the main content of the error correction packets would simply be interleaved RS codes, or more precisely the parity part of them. Btw, in case anyone is wondering how data can get randomly chopped up and reordered, think of a broken hard disk and fragmented files.
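Since noe was never finished there is no real on-disk format to quote, but the self-delimiting-packet idea above (magic + length + sequence number + checksum, so packets can be re-found after random chopping and reordering) can be sketched in a few lines of Python. The layout here (the "NOE1" magic, the field sizes) is entirely my invention for illustration:

```python
import random
import struct
import zlib

MAGIC = b"NOE1"  # hypothetical marker; noe's real format was never fixed

def make_packet(seq, payload):
    # body = sequence number + payload; header = magic + body length;
    # trailer = CRC32 over the body, so false magic hits can be rejected
    body = struct.pack(">I", seq) + payload
    return MAGIC + struct.pack(">H", len(body)) + body + struct.pack(">I", zlib.crc32(body))

def scan_packets(blob):
    # re-find packets by searching for the magic and verifying the checksum,
    # exactly the recovery-by-header-search described above
    found = {}
    i = blob.find(MAGIC)
    while i >= 0:
        if i + 6 <= len(blob):
            (length,) = struct.unpack_from(">H", blob, i + 4)
            if i + 10 + length <= len(blob):
                body = blob[i + 6 : i + 6 + length]
                (crc,) = struct.unpack_from(">I", blob, i + 6 + length)
                if zlib.crc32(body) == crc:
                    (seq,) = struct.unpack_from(">I", body)
                    found[seq] = body[4:]
        i = blob.find(MAGIC, i + 1)
    return found

packets = [make_packet(s, bytes([65 + s]) * 16) for s in range(8)]
random.shuffle(packets)                     # simulate the file being reordered
blob = b"".join(packets)
recovered = scan_packets(blob)
assert sorted(recovered) == list(range(8))  # every packet found and put back in order
```

A real implementation would of course carry the fingerprints and RS parity in the payload; this only shows why reordering of the error correction file itself is harmless.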

Patches to finish noe are of course welcome! :)

mina

mina is the MINimal Alternative, which my lazy self did finish. It simply takes a file and produces an error correction file which is just a bunch of interleaved RS codes (the parity part of them, actually) with no header or anything. It also happily eats corrupted files and corrects them.
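The point of interleaving the codewords is burst tolerance: if symbol i of the file belongs to codeword i mod L, a burst of B consecutive corrupted symbols hits each codeword at most ceil(B/L) times. A tiny check of that arithmetic (my own illustration, not mina's code; L and B are arbitrary):

```python
# With L interleaved RS codewords, symbol i belongs to codeword i % L,
# so a burst of B consecutive bad symbols is spread evenly across codewords.
L = 32                             # number of interleaved codewords
B = 100                            # length of a corrupted burst
hits = [0] * L
for i in range(1000, 1000 + B):    # burst starting at an arbitrary offset
    hits[i % L] += 1
worst = max(hits)
assert worst == -(-B // L)         # ceil(100 / 32) = 4 errors per codeword at most
```

So a (255,223) code interleaved 32 ways would survive bursts of several hundred bytes, even though each individual codeword can only fix 16 errors.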

An example of mina's correction capability is below; note that the images have been converted to jpeg to reduce their size and make them visible in normal browsers. Raw damaged files as tar.gz are available too (mina dz lena.pnm.mina can be used to correct them).

damaged recovered

Source code under the GPL and a git repository are available too; the code is also quite clean and does compile :). The history, though, is sadly quite incomplete, like with the other forgotten code. This time it was IBM's fault, as my private CVS server with the whole history of noe was on an IBM Deathstar disk, and it seems I had no backup of the RCS files. (This is also one of the reasons why I make all that stuff public now: to avoid it being lost due to some other hd failure or stupidity …)

Patches are welcome!!! :)

Filed under: Error Correcting Codes — Michael @ 02:14

2007-07-08

Reed Solomon codes part 2

Asymptotic complexity of the best decoding algorithm known to me ;)

O(n log n + t log² t) for an (n,k) RS code over GF(n+1) and t = n-k

The proof for this is quite easy. Syndrome calculation is just evaluating a polynomial at n-k points, and evaluating a polynomial over a finite field at all points can be done with the GFFT; actually, evaluation at all points is the GFFT of the polynomial. Multiplying 2 polynomials is just 2 GFFTs + componentwise multiplication + 1 IGFFT. Finding the roots of a polynomial can likewise be done by just evaluating it at all points. The only non-trivial operation left for normal RS decoding is solving the key equation, which is equivalent to Euclid's GCD algorithm as well as Schönhage's GCD algorithm; the latter has O(t log² t) complexity (log² t == (log t)² in case that's unclear).
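The "syndromes are just polynomial evaluation" step can be made concrete with a toy RS code. This is my own sketch, not libnoe code: a code over GF(257) with generator g(x) = (x - a^0)…(x - a^(t-1)); a clean codeword evaluates to zero at the first t powers of a, and a corrupted one does not:

```python
P = 257      # GF(257) = GF(2^8 + 1), a Fermat prime
A = 3        # primitive root mod 257

def poly_mul(a, b):
    # schoolbook multiplication of polynomials over GF(P), lowest coeff first
    res = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            res[i + j] = (res[i + j] + x * y) % P
    return res

def poly_eval(p, x):
    acc = 0
    for c in reversed(p):   # Horner's rule
        acc = (acc * x + c) % P
    return acc

n, k = 16, 12
t = n - k
g = [1]
for j in range(t):          # generator polynomial with roots a^0 .. a^(t-1)
    g = poly_mul(g, [(-pow(A, j, P)) % P, 1])

msg = [7, 0, 3, 1, 4, 1, 5, 9, 2, 6, 5, 3]    # k data symbols
code = poly_mul(msg, g)                        # codeword = m(x) * g(x), n symbols

syndromes = [poly_eval(code, pow(A, j, P)) for j in range(t)]
assert syndromes == [0] * t                    # clean codeword: all syndromes zero

code[5] = (code[5] + 99) % P                   # corrupt one symbol
syndromes = [poly_eval(code, pow(A, j, P)) for j in range(t)]
assert any(syndromes)                          # the error shows up
```

Each syndrome here is one point evaluation; doing all of them at once via the GFFT is what gives the n log n term above.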

An alternative to GF(2^x)

Normally RS codes are built over GF(2^x); that way the bits of the elements of an RS codeword have a nice 1:1 mapping to x bits, which can then be stored or transmitted. But this has a big disadvantage: the GFFT for GF(y), needed for fast RS decoding, is done over y-1 points, and so it cannot use the well-known power-of-2 style FFT algorithms, as 2^x - 1 is not a multiple of 2. The solution is to use GF(2^x+1), though note that GF(2^x+1) does not exist for all integer values of x; it only exists if 2^x+1 is a power of a prime, that is p^j. 2 obvious choices using Fermat primes are GF(2^8+1) and GF(2^16+1).
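To see why GF(2^x+1) helps, here is a small sketch of mine (not code from libnoe): in GF(257) the nonzero elements form a multiplicative group of order 256 = 2^8, so an ordinary radix-2 Cooley-Tukey FFT works verbatim, with a root of unity mod 257 in place of exp(2πi/n):

```python
P = 257  # GF(2^8 + 1); multiplicative group order 256 is a power of 2

def ntt(a, omega):
    # classic recursive radix-2 Cooley-Tukey, over GF(257) instead of C;
    # omega must be a root of unity of order len(a)
    n = len(a)
    if n == 1:
        return a[:]
    even = ntt(a[0::2], omega * omega % P)
    odd = ntt(a[1::2], omega * omega % P)
    out = [0] * n
    w = 1
    for i in range(n // 2):                  # butterflies
        out[i] = (even[i] + w * odd[i]) % P
        out[i + n // 2] = (even[i] - w * odd[i]) % P
        w = w * omega % P
    return out

a = [1, 2, 3, 4, 5, 6, 7, 8]
omega = pow(3, 256 // len(a), P)             # 8th root of unity (3 is a primitive root mod 257)
fast = ntt(a, omega)
# naive evaluation of the polynomial at omega^0 .. omega^7 for comparison
naive = [sum(c * pow(omega, i * j, P) for j, c in enumerate(a)) % P
         for i in range(len(a))]
assert fast == naive
```

Over GF(2^x) the same trick is impossible, since no power-of-2-order subgroup exists in a group of odd order 2^x - 1.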

How do you store 2^x+1 values in 2^x values

Trivial ;)

The data part of our RS code is specified by the user, and so it simply doesn't use the (2^x+1)th symbol value; actually, it would be messy to use it. So the only problem left are the n-k parity symbols, which can trivially be transformed to not contain the annoying (2^x+1)th symbol value while at the same time maintaining the property of being an RS code.

Let us assume that we have a symbol (at position y with value y_v) in our k input symbols which is guaranteed to have a value y_v < 2^x - n + k; that is, in practice, a little less than one unused bit. Let p be the RS codeword with all other k-1 data symbols 0 and the symbol at position y equal to 1. The next step is to find all the values of the y element in our original codeword which would cause no parity symbol to have that annoying (2^x+1)th value. For encoding, we simply select the (y_v)th element of this list as the new y_v element; for decoding, we take the number of elements in the list which are smaller than y_v as our new element. As a last step, we just need to add a scaled version of p so as to actually have the wanted y_v element, avoiding the nasty too-large elements while still having an RS code.

Filed under: Error Correcting Codes — Michael @ 20:31
