Let's define some terms first.
Your camera captures an image of X by Y pixels. That defines your native file size - you can make it a bigger file (i.e. more pixels) using various enlarging programs, all of which invent extra pixels by interpolation. You can make it smaller by having the software remove pixels and smooth out the line and tone changes over the remaining ones. None of this resampling affects resolution, i.e. pixels per inch (PPI).
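As a rough sketch of what those resampling programs are doing (assuming Python with the Pillow library; the filenames are just placeholders):

```python
from PIL import Image

img = Image.open("native.jpg")  # e.g. 2700 x 1800 straight from the camera
print(img.size)                 # native pixel dimensions

# Upsampling invents new pixels by interpolation (Lanczos resampling here).
bigger = img.resize((img.width * 2, img.height * 2), Image.LANCZOS)

# Downsampling removes pixels and smooths the transitions over the rest.
smaller = img.resize((img.width // 2, img.height // 2), Image.LANCZOS)

bigger.save("bigger.jpg")
smaller.save("smaller.jpg")
```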
All PPI represents is how many pixels there are in a given unit of length. Thus a 2700 by 1800 pixel image would appear as a 9" by 6" image at 300ppi, or 18" by 12" at 150ppi.
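The arithmetic is simply pixels divided by ppi; a throwaway helper (hypothetical function name) to illustrate:

```python
def print_size(width_px, height_px, ppi):
    """Print dimensions in inches for a given output resolution."""
    return width_px / ppi, height_px / ppi

print(print_size(2700, 1800, 300))  # (9.0, 6.0)   -> 9" x 6"
print(print_size(2700, 1800, 150))  # (18.0, 12.0) -> 18" x 12"
```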
For prints, 300ppi is a magic number, as it is supposed to be the finest resolution the human eye can resolve at normal viewing distance - there is no point going any higher than that. In fact, for big prints - say A3 size - you don't really need 300ppi, because people tend to look at them from further away, which means the maximum perceivable resolution drops. I print my 18" by 12" at around 250ppi, and even that is probably overkill. The best thing to do here is to try some different resolutions on your own images and see how they look with your camera, lenses and printer.
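One way to set up that test (again a sketch assuming Pillow; the 18" x 12" target and the ppi values are just examples):

```python
from PIL import Image

TARGET_INCHES = (18, 12)  # the print size you intend to make

img = Image.open("native.jpg")
for ppi in (150, 200, 250, 300):
    w, h = TARGET_INCHES[0] * ppi, TARGET_INCHES[1] * ppi
    test = img.resize((w, h), Image.LANCZOS)
    # The dpi metadata tells the printer driver the intended ppi.
    test.save(f"test_{ppi}ppi.tif", dpi=(ppi, ppi))
```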
Pixels per inch refers to the image file, though, not to the printer's DPI (DOTS per inch), which reflects how many ink dots it puts down. Different printers and papers will need different settings, but as a rule of thumb a higher dpi will give better tonal gradation and fine detail, unless you get to the point where the paper is overloaded with ink. I find on my Epson 2100 that 1440dpi works well for most papers.
JPEG versus RAW is really a different thing altogether. This is about file type and data processing, and has virtually nothing to do with pixel dimensions or resolution. My experience is only with Canon digital, so other brands may differ, but they should be broadly similar. RAW just records unprocessed data from the sensor and applies no presets such as sharpness, saturation etc. - all of that is done in the RAW processing software on the computer. JPEG compresses the data, which is where the 'fine' etc. settings come in: they define how much compression and simplification of the image data is done in the camera. The camera will also apply some presets such as sharpness and saturation, which can usually be defined by the photographer.
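To make the "processing happens on the computer" point concrete, here is a minimal RAW-conversion sketch, assuming Python with the third-party rawpy and imageio libraries (the .CR2 filename is a placeholder for a Canon RAW file):

```python
import rawpy
import imageio.v3 as iio

# The RAW file holds unprocessed sensor data; every choice below
# (white balance, brightness, bit depth) is made here, not in the camera.
with rawpy.imread("IMG_0001.CR2") as raw:
    rgb = raw.postprocess(
        use_camera_wb=True,   # or substitute your own white balance
        no_auto_bright=True,  # keep exposure adjustment manual
        output_bps=16,        # 16 bits per channel for editing headroom
    )

iio.imwrite("IMG_0001.tif", rgb)  # save losslessly for further editing
```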
If you are absolutely spot on with exposure, white balance etc. every time, and don't need to do much if any post-production editing, then JPEG is great - lots of images per card, quick downloads etc. But if you sometimes need to correct exposure, change white balance, do major conversions or convert to different colour spaces, then RAW may be better, as it gives you a lot more flexibility after image capture. (Yes, you can do all of that with JPEG too, but you have already lost data in the camera, and sequential saves as JPEG quite quickly lead to loss of image quality. Converting your JPEG to a lossless format such as TIFF before major editing is a better way to go.)
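You can see that generational JPEG loss for yourself with a quick experiment (a sketch assuming Pillow; quality 85 is just an example setting):

```python
from PIL import Image

img = Image.open("original.jpg")

# Re-save the same image ten times; each save re-compresses and loses data.
for generation in range(10):
    img.save("regenerated.jpg", quality=85)
    img = Image.open("regenerated.jpg")
    img.load()  # force the pixel data to be read before the next save

# A lossless TIFF, by contrast, survives any number of save/open cycles.
img.save("edit_me.tif")
```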
I'm less sure about colour spaces - someone else here is probably better qualified to advise. I have always used Adobe RGB as my capture space and, until Lightroom came along, as my working colour space too. I now use ProPhoto RGB as my working space, as Lightroom seems to be built around it (it has a wider gamut, apparently). The output colour space really depends on what the image is being used for. sRGB is probably best for web use, as it gives the most consistent colour across different platforms.
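For the web-output step, here is a conversion sketch assuming Pillow's ImageCms module; the Adobe RGB profile path is an assumption (you would point it at the .icc file on your own system):

```python
from PIL import Image, ImageCms

img = Image.open("photo_adobergb.tif")

# Input profile: Adobe RGB, loaded from an ICC file (path is an assumption).
adobe_rgb = ImageCms.getOpenProfile("AdobeRGB1998.icc")
# Output profile: Pillow can build a standard sRGB profile itself.
srgb = ImageCms.createProfile("sRGB")

# Convert the pixel values so colours look right in web browsers.
web_img = ImageCms.profileToProfile(img, adobe_rgb, srgb, outputMode="RGB")
web_img.save("photo_web.jpg", quality=90)
```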
Hope that helps - anyone, please feel free to correct me on the above!
