Lecture 2-Image Acquisition Image Representation

Digital images are represented as 2D arrays of pixels, each with an intensity value. An image is acquired by sampling and quantizing an analog scene into discrete pixel values. Spatial resolution is determined by sampling rate, while intensity resolution depends on the number of quantization levels. Histograms show the frequency distribution of pixel intensities and can help analyze images for issues like over/under exposure.

Image Acquisition & Image

Representation
Lecture 2
EEE429-Image and Video Communications / EME408-Image Processing
and Vision
Nuwan Vithanage
Relationship with Image Processing &
Computer Vision
Imaging
• The first digital photo came in 1957 when Russell Kirsch made a
176×176 pixel digital image by scanning a photograph of his three-
month-old son

First Digital Photo Original Photo


Digital Image Definitions
• “A digital image f [x, y] described in a 2D discrete space is derived
from an analog image f (x, y) in a 2D continuous space through a
sampling process that is frequently referred to as digitization”
• Digital image: discrete samples f [x, y] representing continuous
image f (x, y)
• Each element of the 2D array f [x, y] is called a pixel or pel (from
“picture element”)
Digital Image Definitions
• 2-dimensional matrix of intensity (gray or color values)
Digital Image Definitions
Digital Image ?
• Remember: digitization causes a digital image to become an
approximation of a real scene
Examples of Digital Images
Digital Image
• Common image formats include:
• 1 value per point/pixel (B&W or Grayscale)
• 3 values per point/pixel (Red, Green, and Blue)
• 4 values per point/pixel (Red, Green, Blue, + “Alpha” or Opacity)

• We will start with gray‐scale images, extend to color later


What is image processing?
• Algorithms that alter an input image to create a new image
• Input is image, output is image

• Improves an image for human interpretation in ways including:


• Image display and printing
• Image editing
• Image enhancement
• Image compression
Key Stages in Digital Image Processing
Key Stages in Digital Image Processing:
Image Acquisition
Key Stages in Digital Image Processing:
Image Enhancement
Key Stages in Digital Image Processing:
Image Restoration
Key Stages in Digital Image Processing:
Morphological Processing
Key Stages in Digital Image Processing:
Segmentation
Key Stages in Digital Image Processing:
Representation & Description
Key Stages in Digital Image Processing:
Object Recognition
Key Stages in Digital Image Processing:
Image Compression
Key Stages in Digital Image Processing:
Color Image Processing
Human Visual System: Structure Of The Human
Eye
• The lens focuses light from objects onto the
retina
• Retina covered with light receptors called
cones (6 ‐ 7 million) and rods (75 ‐150
million)
• Cones concentrated around fovea. Very
sensitive to color
• Rods more spread out and sensitive to low
illumination levels
Image Formation In The Eye
• Muscles in the eye can change the shape of the lens, allowing us to focus
on near or far objects
• An image is focused onto the retina, exciting the rods and cones, which
send signals to the brain
Imaging System
Brightness Adaptation & Discrimination
• The human visual system can perceive approximately 10^10 different
light intensity levels
• However, at any one time we can only discriminate between a much
smaller number – brightness adaptation
• Similarly, perceived intensity of a region is related to the light
intensities of the regions surrounding it
Brightness Adaptation & Discrimination:
Mach Band Effect
Brightness Adaptation & Discrimination
Image Acquisition
• Images are typically generated by illuminating a scene and capturing the
energy reflected by scene objects
Image Sensing
• Incoming energy (e.g. light) lands on a sensor material responsive to
that type of energy, generating a voltage
• Collections of sensors are arranged to capture images
Image (Spatial) Sampling
• A digital sensor can only measure a limited number of samples at a
discrete set of energy levels
• Sampling can be thought of as multiplying the continuous signal by a comb function
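The idea of sampling a continuous signal at evenly spaced points can be sketched in plain Python. This is an illustrative 1D example, not from the slides; the function names are made up for the sketch:

```python
import math

def sample_signal(f, x_max, num_samples):
    """Sample a continuous 1D signal f(x) at num_samples evenly spaced points."""
    step = x_max / num_samples          # sampling interval (spacing of the "comb")
    return [f(i * step) for i in range(num_samples)]

# The same analog signal sampled at a high and a low rate:
fine   = sample_signal(math.sin, 2 * math.pi, 64)   # fine spatial resolution
coarse = sample_signal(math.sin, 2 * math.pi, 8)    # coarse spatial resolution
```

The fewer samples we take, the coarser the approximation of the underlying continuous signal, which is exactly the spatial-resolution trade-off discussed later.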
Image Quantization
• Quantization: process of converting continuous analog signal into its
digital representation
• Discretize image I(u,v) values
• Limit values image can take
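Quantization as described above can be sketched as mapping a continuous intensity to one of a fixed number of discrete levels. A minimal illustration (the function and parameter names are assumptions, not from the slides):

```python
def quantize(value, levels, v_max=1.0):
    """Map a continuous intensity in [0, v_max] to one of `levels` discrete bins."""
    q = int(value / v_max * levels)      # index of the quantization bin
    return min(q, levels - 1)            # clamp the top edge (value == v_max)

# 8 levels (3 bits per pixel): nearby analog values collapse to the same bin
print(quantize(0.0, 8), quantize(0.49, 8), quantize(1.0, 8))   # 0 3 7
```

With only 8 levels, all intensities between 0.375 and 0.5 map to the same value 3, which is why quantization makes the digital image an approximation of the real scene.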
Image Sampling And Quantization
• Sampling and quantization generates approximation of a real world
scene
Image as Discrete Function
• After spatial sampling and quantization, an image is a discrete
function
• The image domain Ω is now discrete:
Ω = {0, 1, …, M−1} × {0, 1, …, N−1} ⊂ Z²
and so is the image range:
f [x, y] ∈ {0, 1, …, K−1}
where K = 2^k is the number of intensity levels for k bits per pixel
Image as a Function
Representing Images
• Image data structure is 2D array of pixel values
• Pixel values are gray levels in range 0‐255 or RGB colors
• Array values can be any data type (bit, byte, int, float, double, etc.)
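The 2D-array view of an image described above can be shown directly with a plain Python nested list (the pixel values here are arbitrary, chosen only for illustration):

```python
# A tiny 3x4 grayscale image as a 2D array: rows of pixel intensities in 0-255
image = [
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [255, 192, 128,   0],
]

height = len(image)        # number of rows (y direction)
width  = len(image[0])     # number of columns (x direction)
pixel  = image[1][2]       # the pixel f[x=2, y=1] -> 160
```

Indexing as `image[y][x]` matches the f [x, y] notation: the outer index selects a row, the inner index a column.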
Spatial Resolution
• The spatial resolution of an image is determined by how fine/coarse
sampling was carried out
• Spatial resolution: smallest discernable image detail
• Vision specialists talk about image resolution
• Graphic designers talk about dots per inch (DPI)
Spatial Resolution
Spatial Resolution: Stretched Images
Intensity Level Resolution
• Intensity level resolution: number of intensity levels used to
represent the image
• The more intensity levels used, the finer the level of detail discernable in an
image
• Intensity level resolution usually given in terms of number of bits used to
store each intensity level
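The bits-to-levels relationship stated above can be sketched in a couple of lines (helper names are made up for the sketch):

```python
def num_levels(bits):
    """Number of distinct intensity levels representable with `bits` bits."""
    return 2 ** bits

def bits_needed(levels):
    """Minimum number of bits required to store `levels` distinct values."""
    return max(1, (levels - 1).bit_length())

# An 8-bit image has 2^8 = 256 intensity levels, and 256 levels need 8 bits
print(num_levels(8), bits_needed(256))   # 256 8
```

Each extra bit doubles the number of intensity levels, so detail discernable in smooth gradients grows quickly with bit depth.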
Intensity Level Resolution
Intensity Level Resolution
Saturation & Noise
Resolution: How Much Is Enough?
• The big question with resolution is always
how much is enough?
• Depends on what is in the image (details) and
what you would like to do with it (applications)
• Key questions:
• Does image look aesthetically pleasing?
• Can you see what you need to see in image?

• Example: Picture on right is okay for counting the number of cars, but
not for reading the number plate
Image File Formats
• Hundreds of image file formats. Examples
• Tagged Image File Format (TIFF)
• Graphics Interchange Format (GIF)
• Portable Network Graphics (PNG)
• JPEG, BMP, Portable Bitmap Format (PBM), etc
• Image pixel values can be
• Grayscale: 0 – 255 range
• Binary: 0 or 1
• Color: RGB colors in 0‐255 range (or other color model)
• Application specific (e.g. floating point values in astronomy)
How many Bits Per Image Element?
Example Operations: Noise Removal
Examples: Noise Removal
Example: Contrast Adjustment
Example: Edge Detection
Example: Region Detection, Segmentation
Example: Image Compression
Example: Image Inpainting
Examples: Artistic (Movie Special) Effects
Applications of Image Processing
Applications of Image Processing: Medicine
Applications of Image Processing: GIS
Applications of Image Processing:
Law Enforcement
Applications of Image Processing: HCI
Histograms
• A histogram plots how many times (frequency) each intensity value
occurs in an image
• Example:
• Image (left) has 256 distinct gray levels (8 bits)
• Histogram (right) shows frequency (how many times) each gray level occurs
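Counting how often each gray level occurs, as described above, takes only a few lines of plain Python (the example image is made up for illustration):

```python
def histogram(image, K=256):
    """Count how often each intensity 0..K-1 occurs in a 2D grayscale image."""
    h = [0] * K
    for row in image:
        for intensity in row:
            h[intensity] += 1
    return h

image = [[  0,   0, 255],
         [128,   0, 255]]
h = histogram(image)
print(h[0], h[128], h[255])   # 3 1 2
```

The entries of h always sum to the total number of pixels, since every pixel falls into exactly one bin.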
Histograms
• Many cameras display real time histograms of scene
• Helps avoid taking over‐exposed pictures
• Also easier to detect types of processing previously applied to image
Histograms

• E.g. with K = 16 levels, h(2) = 10 means 10 pixels have intensity value 2


• Histograms: only statistical information
• No indication of location of pixels
Histograms
• Different images can have same histogram
• 3 images below have same histogram

• Half of pixels are gray, half are white


• Same histogram = same statistics
• Distribution of intensities could be different
• Can we reconstruct image from histogram? No!
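The point that different images share a histogram can be demonstrated directly; the two tiny images below are made up for illustration:

```python
def histogram(image, K=4):
    """Histogram of a 2D image whose intensities lie in 0..K-1."""
    h = [0] * K
    for row in image:
        for v in row:
            h[v] += 1
    return h

# Two different 2x2 images whose pixel layouts differ...
a = [[1, 3], [1, 3]]
b = [[3, 1], [3, 1]]
# ...but whose histograms are identical
print(histogram(a) == histogram(b))   # True
```

Because the histogram discards all location information, no algorithm can invert it back to a unique image.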
Histograms
• So, a histogram for a grayscale image with intensity values in the range
0 to K−1 would contain exactly K entries


• E.g. 8‐bit grayscale image, K = 2^8 = 256
• Each histogram entry is defined as:
• h(i) = number of pixels with intensity i, for all 0 ≤ i < K
• E.g: h(255) = number of pixels with intensity = 255
• Formal definition
Interpreting Histograms
• Log scale makes low values more visible
Histograms
• Histograms help detect image acquisition issues
• Problems with image can be identified on histogram
• Over and under exposure
• Brightness
• Contrast
• Dynamic Range
• Point operations can be used to alter histogram. E.g
• Addition
• Multiplication
• Exp and Log
• Intensity Windowing (Contrast Modification)
Summary
• We have looked at:
• What is digital image?
• How to acquire a digital image?
• What is quantization and sampling?
• What is spatial and intensity resolution?
• Next time we will continue with more on Histograms, Image
Contrast, Intensity Transforms, Point Operations and Intensity
Windowing
