Adobe Photoshop is the standard JPEG editor on Apple Macs. Linux users use GIMP for the same job, and the other 90% of computer users, the Windows users, use Paint Shop Pro or an equivalent. Will Windows users switch to Photoshop now that Photoshop's younger sister, Photoshop Elements 2.0, is given away free with just about every piece of hardware associated with digital images?
One big advantage of Photoshop is the ability to read raw image files. A raw file is the file produced by the digital camera before the camera's computer has a chance to damage it. When you read a raw file into Photoshop, you can selectively apply changes to improve the image instead of degrading it with the automatic and arbitrary changes the camera's manufacturer selected for the camera.
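As a rough sketch of what "read the raw file and apply your own choices" means in practice, here is a few lines of Python assuming the third-party rawpy and imageio libraries; the file names are hypothetical and the options shown are just examples of decisions you take back from the camera.

    import rawpy
    import imageio

    # Open the raw file before any in-camera processing has touched it.
    with rawpy.imread("photo.cr2") as raw:
        # Our choices, not the manufacturer's defaults: keep the camera's
        # white balance, skip automatic brightening, keep 16 bits per channel.
        rgb = raw.postprocess(
            use_camera_wb=True,
            no_auto_bright=True,
            output_bps=16,
        )

    # Save the result losslessly for further editing.
    imageio.imwrite("photo.tiff", rgb)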
Think about the word raw. If you buy raw tomatoes instead of the millions of red coloured products sold as tomato, you get to decide if you want to cook the tomato, add salt, keep the juice, or eat the tomato like any other fruit. You get far more choice than if you buy a processed tomato or a red coloured food-like substance. Of course, if you want cooked tomato with your meal, you have to learn to cook tomatoes or settle for a processed tomato you can heat up in the microwave oven.
Digital images go through a similar range of processes. The raw image file is read straight from the image chip, much like picking a tomato from the vine. If you read the raw image into your image editor, you have the most choice but have to do the most work.
The first step in processing a tomato is to cook it and seal it in a can. The equivalent in your digital camera is to run the image through the camera's computer, then save the result as a TIFF file on the memory chip. The computer takes a guess at what might be errors in the image and erases that information, even if it is a highlight of your image. Canned tomatoes taste bland compared to fresh ones, and the camera-processed file is similarly flat compared to the raw file. The next step in the camera is to merge information from several image sensors to create a coloured pixel.
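As a toy illustration of how that in-camera step throws information away, not any manufacturer's actual firmware, consider squeezing 12-bit sensor values into the 8 bits of a typical saved file, in Python with NumPy:

    import numpy as np

    # Hypothetical 12-bit sensor readings (0-4095), including a bright highlight.
    raw_values = np.array([100, 2048, 3900, 4095], dtype=np.uint16)

    # A crude stand-in for in-camera processing: scale down to 8 bits (0-255).
    # Sixteen distinct 12-bit levels collapse into each 8-bit level, so subtle
    # highlight detail is erased before the file ever reaches your editor.
    eight_bit = (raw_values >> 4).astype(np.uint8)
    print(eight_bit)  # [  6 128 243 255]

Once those sixteen-to-one merges happen, no editor can recover the original distinctions; that is the "canning" step.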
In an 8 million pixel camera there are 2 million red sensors, 2 million blue sensors, 2 million green sensors, and 2 million other sensors that may be green or a variation on green. Each sensor produces a monochrome measurement to an accuracy of 8, 10, or 12 bits. The camera then guesses what each of the 8 million sensors would have seen had they all been able to see three colours, and you get an image containing 8 million pixels, each with three colours to a depth of 8 or 16 bits. A 16 bit pixel is built from the 12 bit monochrome measurement at that location plus three adjoining 12 bit measurements chosen so that at least one is red, one is green, and one is blue. Where you really should get one pixel containing three colours of 12 bits each, you instead get four pixels, each with three colours padded out to a depth of 16 bits.
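To make the guessing concrete, here is a toy bilinear demosaic of a Bayer mosaic in Python with NumPy and SciPy. The RGGB layout, the function name, and the weights are illustrative assumptions for the sketch, not any manufacturer's actual algorithm, but the shape of the problem is the one described above: one measured colour per site, two guessed from the neighbours.

    import numpy as np
    from scipy.ndimage import convolve

    # Bilinear weights: a measured neighbour contributes by distance.
    KERNEL = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])

    def demosaic_rggb(mosaic):
        """Toy bilinear demosaic of an RGGB Bayer mosaic (a sketch only)."""
        h, w = mosaic.shape
        masks = np.zeros((3, h, w))
        masks[0, 0::2, 0::2] = 1            # red sites
        masks[2, 1::2, 1::2] = 1            # blue sites
        masks[1] = 1 - masks[0] - masks[2]  # green sites (half of all sensors)

        rgb = np.empty((h, w, 3))
        for i in range(3):
            known = mosaic * masks[i]
            # Normalised convolution: average the measured neighbours.
            est = convolve(known, KERNEL) / convolve(masks[i], KERNEL)
            # Keep the actual measurement where this colour was sensed.
            rgb[..., i] = np.where(masks[i] == 1, mosaic, est)
        return rgb

    # e.g. roughly 8 million monochrome measurements in,
    # 8 million three-colour pixels out, two-thirds of them guessed.
    mosaic = np.random.randint(0, 4096, (2448, 3264)).astype(float)
    image = demosaic_rggb(mosaic)

Real cameras use cleverer interpolation than this, but the arithmetic is the same trade: every output pixel carries one genuine measurement and two estimates borrowed from its neighbours.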