Wednesday, September 29, 2004



  1. School of Engineering & Technology, Bharathidasan University, Trichy.
  2. J.J. College of Engineering & Technology, Trichy.

Biometrics is the measurement of physical, biological or behavioural characteristics of an individual - the face, finger, hand, iris, voice or signature - used to create a unique identifier that can be electronically stored, retrieved and compared for identification purposes. In simple words, biometrics means using the body as a password. There are various biometric techniques available.

Finger Prints: When you place your finger on the pad of an optical scanner, sensors look for a pattern of minutiae in the print and match it with the previously recorded print. A single rolled fingerprint may have 100 or more identification points that can be used for identification purposes. When the finger touches a silicon sensor, the pattern of ridges and valleys is determined during a cycle of charging and discharging the capacitor array. The data are scanned at 500 dots per inch, converted to digital form, and a map of unique finger characteristics is created on the basis of minutia detection. The map is then compared with a data bank of known prints.
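The minutiae-map comparison can be sketched in Python. This is a minimal illustration, not the matcher of any particular scanner: minutiae are assumed to be (x, y, ridge-angle) tuples already aligned to a common frame, and the hypothetical `match_minutiae` helper simply counts pairings within a distance and angle tolerance.

```python
import math

def match_minutiae(probe, template, dist_tol=10.0, angle_tol=0.26):
    """Count probe minutiae that pair with a template minutia.

    Each minutia is an (x, y, theta) tuple; a pair matches when the
    points are within dist_tol pixels and the ridge angles are within
    angle_tol radians.  Real matchers also align the prints first.
    Returns a similarity score in [0, 1].
    """
    matched = 0
    used = set()
    for (px, py, pa) in probe:
        for i, (tx, ty, ta) in enumerate(template):
            if i in used:
                continue
            if math.hypot(px - tx, py - ty) <= dist_tol and abs(pa - ta) <= angle_tol:
                matched += 1
                used.add(i)
                break
    return matched / max(len(probe), len(template))
```

A score near 1 indicates the same finger; unrelated prints score near 0.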

Iris Scanning : As an identifying body part, the human iris, the coloured part of the eye, has several advantages. It is an integral part of the body, so it is not amenable to easy modification. Unlike fingerprints, the iris can be imaged from one metre away. Yet, like fingerprints, iris patterns are unique to individuals, including identical twins, without exception. These patterns are stable throughout life. The iris algorithm precisely locates the outer and inner borders of the iris, and detects and excludes the eyelids if they cover a part of the iris. The system uses a mathematical technique called wavelet analysis to translate the image of the iris into a 512-bit pattern. Wavelet analysis is a mathematical relative of Fourier analysis, and it breaks down an image into a set of spatially limited waves. This pattern, which is called the iris code, is defined in a coordinate system that is invariant to changes in pupil contraction and to the size of the image itself. Once an iris code is prepared, the algorithm compares a specific code against a group of codes previously stored in the computer. If there is no match, the fraction of disagreement should be close to 0.5; otherwise it should be close to 0.
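The "fraction of disagreement" is a fractional Hamming distance, which is easy to demonstrate on simulated 512-bit codes (a sketch of the comparison step only, not of iris-code extraction):

```python
import random

def hamming_fraction(code_a, code_b):
    """Fraction of bits on which two equal-length iris codes disagree."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

# Two statistically independent 512-bit codes disagree on about half
# their bits; a code compared with itself disagrees on none.
random.seed(7)
code1 = [random.randint(0, 1) for _ in range(512)]
code2 = [random.randint(0, 1) for _ in range(512)]
```

A decision threshold somewhere between 0 and 0.5 (roughly one third in deployed systems) therefore separates same-iris from different-iris comparisons.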

Face Off : Facial recognition works by isolating human faces in still pictures and measuring an array of facial characteristics, such as the geometry of a person’s eyes, mouth and nose. Using a proprietary algorithm, the system compares the image to database-stored photos for probability-ranked matches, as certain facial aspects do not change, even with age or weight fluctuations.

Voice Biometrics and Voice Prints : A voiceprint is a digital representation of the unique features of an individual’s voice. Voices differ due to a number of physiological characteristics such as the vocal cords, trachea, nasal passages, the movement of the tongue inside one’s mouth to produce certain sounds, etc. The combination of these characteristics is analyzed and identified as unique for every individual. A voiceprint is not a sound file (recording) of the individual, and therefore cannot be replayed as a recording by an impostor.
Multi-Biometrics: Fingerprint-mosaicking : The reduced contact area offered by solid-state fingerprint sensors does not provide sufficient information (e.g., minutiae) for high-accuracy user verification. Multiple impressions of the same finger acquired by these sensors may have only a small region of overlap, thereby affecting the matching performance of the verification system. To deal with this problem, a fingerprint-mosaicking scheme that constructs a composite fingerprint image from multiple impressions is used.

In this algorithm, two impressions of a finger are initially aligned using the corresponding minutia points. The alignment is refined by the well-known iterative closest point (ICP) algorithm to compute a transformation matrix that defines the spatial relationship between the two impressions. The transformation matrix is used in two ways. (a) The two impressions are stitched together to generate a composite image; minutiae points are then detected in this composite image. (b) The minutia maps obtained from each of the individual impressions are integrated to create a larger minutia map. The availability of a composite template improves the performance of the fingerprint matching system.
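Step (b) can be sketched in Python, assuming ICP has already produced a rigid transform (rotation theta, translation tx, ty); the helper names here are illustrative, not from the paper:

```python
import math

def transform_minutiae(minutiae, theta, tx, ty):
    """Map (x, y) minutiae of one impression into the frame of the other
    using the rigid transform that ICP would estimate."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in minutiae]

def merge_maps(map_a, map_b, tol=3.0):
    """Union of two aligned minutia maps, dropping near-duplicate points
    that appear in the overlap region of both impressions."""
    merged = list(map_a)
    for bx, by in map_b:
        if all(math.hypot(bx - ax, by - ay) > tol for ax, ay in merged):
            merged.append((bx, by))
    return merged
```

The merged map covers more of the finger than either impression alone, which is what improves matching.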

Dental Biometrics : Given a dental record, usually a postmortem (PM) radiograph, we need to search a database of ante-mortem (AM) radiographs to determine the identity of the person associated with the PM image. A semi-automatic method is used to extract the shapes of the teeth from the AM and PM radiographs.

A ranking of matching scores is generated based on the distance between the AM and PM tooth shapes. Matching dental images based on tooth shapes and their relative positions is a feasible method for human identification.
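The ranking step can be sketched as follows. The contour representation and the simple point-to-point distance are assumptions for illustration; real systems register the shapes before measuring distance.

```python
import math

def shape_distance(a, b):
    """Sum of point-to-point distances between two tooth contours,
    assumed here to be equal-length lists of (x, y) points that are
    already aligned."""
    return sum(math.hypot(px - qx, py - qy)
               for (px, py), (qx, qy) in zip(a, b))

def rank_candidates(pm_shape, am_database):
    """Rank ante-mortem (AM) records by distance to the post-mortem (PM)
    tooth shape: the most likely identity comes first."""
    return sorted(am_database,
                  key=lambda identity: shape_distance(pm_shape, am_database[identity]))
```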

Iris scanning and fingerprint imaging have been found to be effective when compared with other biometric techniques. To substantiate this, the Connecticut welfare programme issues ID cards carrying a photograph and signature along with an encoded fingerprint. In future we may encounter newer biometric technologies for better human identification.

[Figs. 1-4: Map of unique finger characteristics]

Tuesday, September 28, 2004



1. School of Engineering & Technology, Bharathidasan University

2. J.J. College of Engineering & Technology, Trichy.

In 1989, responsibility for the Global System for Mobile communications (GSM) was transferred to the European Telecommunications Standards Institute (ETSI), and Phase I of the GSM specifications was published in 1990. Commercial GSM service started in mid-1991.

GSM uses a variation of Time Division Multiple Access (TDMA) and is the most widely used of the three digital wireless telephone technologies (TDMA, GSM and CDMA). GSM digitizes and compresses data, then sends it down a channel with two other streams of user data, each in its own time slot. This time-slotting is a form of multiplexing, which divides the available bandwidth among the different channels. The digital nature of GSM allows both synchronous and asynchronous data to be transported as a bearer service to or from an ISDN terminal. The data rates supported by GSM are 300 bps, 600 bps, 1200 bps, 2400 bps and 9600 bps. The most basic teleservice supported by GSM is telephony. A unique feature of GSM compared to older analog systems is the Short Message Service (SMS).
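The slot-per-stream idea can be illustrated with a toy multiplexer. This is a sketch of the time-division principle only, not of the real GSM frame structure (which uses eight slots per frame):

```python
def tdma_multiplex(streams):
    """Interleave equal-length user bit-streams into time slots, one
    slot per user per frame.  Returns the single channel as a sequence
    of (user_index, bit) slots."""
    channel = []
    for frame in zip(*streams):          # one frame = one slot per user
        for user, bit in enumerate(frame):
            channel.append((user, bit))
    return channel

def tdma_demultiplex(channel, n_users):
    """Recover each user's stream by picking out its own slots."""
    streams = [[] for _ in range(n_users)]
    for user, bit in channel:
        streams[user].append(bit)
    return streams
```

Each user transmits only during its own slot, so three streams share one channel without interfering.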

Services like mobile banking, ticket booking and info services are today available exclusively on GSM (TDMA) networks. A GSM (TDMA) mobile has a Subscriber Identity Module (SIM) card, which provides more functionality and convenience (e.g., change your phone, but keep your phone numbers and settings). Above all, you can take a GSM (TDMA) phone virtually anywhere in the world and keep talking. Though GSM (TDMA) will not accommodate more than a finite number of users (the user will get a Network Busy message if this number is exceeded), there won’t be any deterioration in voice quality due to traffic. In addition, GSM (TDMA) is equipped with frequency hopping, i.e., when a lower frequency is cluttered, the mobile phone effortlessly jumps to a higher frequency (e.g., 900 MHz to 1800 MHz). GSM (TDMA) networks also employ the Enhanced Full Rate (EFR) codec, which improves the voice quality greatly.

Code Division Multiple Access (CDMA) is a digital wireless technology that was pioneered and developed by Qualcomm; it was commercially introduced in 1995. CDMA is a ‘spread spectrum’ technology, which means that it spreads the information contained in a particular signal of interest over a much greater bandwidth than the original signal. A CDMA call starts at a rate of 9600 bits per second (9.6 kilobits per second). This is then spread to a transmitted rate of about 1.23 megabits per second. Spreading means that digital codes are applied to the data bits associated with users in a cell. The goal of spread spectrum is a substantial increase in the bandwidth of an information-bearing signal, far beyond that needed for basic communication. CDMA uses a unique code to distinguish each different call, which enables many more people to share the airwaves at the same time without static, cross-talk or interference.
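The code-division idea can be sketched with short orthogonal (Walsh) chip codes; real CDMA uses much longer pseudo-random sequences, which is how 9.6 kbps of data becomes a 1.23 Mbps chip stream. This toy version shows two users sharing one channel:

```python
def spread(bits, code):
    """Spread each data bit over a user's chip code: a 1 bit sends the
    code, a 0 bit sends its negation (chips are +/-1 values)."""
    chips = []
    for b in bits:
        sign = 1 if b else -1
        chips.extend(sign * c for c in code)
    return chips

def despread(received, code):
    """Correlate the summed channel with one user's code to recover that
    user's bits; orthogonal codes make the other users cancel out."""
    n = len(code)
    bits = []
    for i in range(0, len(received), n):
        corr = sum(r * c for r, c in zip(received[i:i + n], code))
        bits.append(1 if corr > 0 else 0)
    return bits
```

Both users transmit simultaneously on the same frequency; each receiver picks out its own signal purely by correlating with its own code.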

CDMA is a form of multiplexing (more than one user is given access to the same resource), which allows the use of a particular frequency for a number of signals, optimizing the use of available bandwidth.

With CDMA, unique digital codes, rather than separate Radio Frequencies (RF) or channels, are used to differentiate subscribers. The codes are shared by both the mobile station (cellular phone) and the base station (the base station connects the mobile to the Mobile Switching Centre [MSC]), and are called pseudo-random code sequences. Since each user is separated by a unique code, all users can share the same frequency band (range of radio spectrum). This gives the CDMA technique many unique advantages over other Radio Frequency (RF) techniques in cellular communication.

CDMA is typically deployed through Wireless in Local Loop (WiLL/WLL) systems where only the last mile is wireless (instead of copper wires). In CDMA, you are stuck with one WLL mobile operator, because the WLL handset is programmed and locked to work with the service provider who sells it to you; it does not have a SIM card. It is expensive and time-consuming to reprogram the handset in case you want to change your service provider. Unfortunately, if your CDMA handset is stolen, lost or damaged, you cannot be reconnected immediately: you would have to procure a new handset and get it reprogrammed. Since CDMA is a newer technology, the network is not set up to provide as many facilities as GSM (TDMA). Because CDMA is the standard for mobile communication in very few countries, it cannot offer international roaming, which is a large disadvantage. CDMA technology has a soft-capacity feature, i.e., as the number of users on the network goes up, the voice quality progressively gets poorer (rather than new calls being blocked outright).

A GSM (TDMA) tower’s talk range is 35 km, in comparison with CDMA’s 110 km, and the power output of a GSM (TDMA) phone is 2 W in comparison with a CDMA phone’s 200 mW, i.e., CDMA implies a lesser radiation hazard. For a city like Chennai (a typical Indian city), a GSM grid requires approximately 130 base stations to cover the city, whereas WiLL/WLL requires only about 10 base stations, as they beam all frequencies received by everyone: because CDMA uses the entire frequency spectrum, the broadcast can happen at very high signal strength spread over the whole radius.

The number of channels (users) that can be allocated in a given bandwidth is comparatively higher for CDMA than for GSM. The cost of setting up a CDMA network is also comparatively less than the GSM network. Due to these advantages, there is high probability that CDMA technology will dominate the future of mobile communication.

The following parameters may help you take a decision, based on their importance in your usage pattern: (1) mobility (i.e., between different circles), (2) Short Message Service, (3) frequency of handset change (or the need to change handsets) and (4) your expenses on mobile. In case you are a heavy user on the first three parameters, go for GSM and wait for CDMA to grow up. If the last parameter dominates your mobile life and you can compromise on the first three, you can go for CDMA.
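As a caricature of this advice, the four parameters could be turned into a toy scoring function (the weighting scheme and the `recommend_network` helper are purely illustrative, not from the article):

```python
def recommend_network(mobility, sms, handset_changes, cost_sensitivity):
    """Toy decision helper: rate each of the article's four parameters
    from 0 (unimportant) to 5 (critical).  Heavy weight on the first
    three favours GSM; a dominant cost concern favours CDMA."""
    gsm_score = mobility + sms + handset_changes
    return "CDMA" if cost_sensitivity > gsm_score else "GSM"
```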

Monday, June 28, 2004



1. School of Engineering & Technology, Bharathidasan University, Trichy.
2. J.J. College of Engineering & Technology, Trichy.

The term ‘Digital Image Processing’ generally refers to the processing of a two-dimensional picture by a digital computer. A digital image is an array of real or complex numbers represented by a finite number of bits. The field of image processing deals with the improvement of images for human perception. Images can have either analog or digital representation.

One of the various fields in Image Processing is ‘Image Enhancement’. Image enhancement consists of a collection of techniques that seek to improve the visual appearance of an image or to convert the image into a form better suited for analysis by a human or machine.

Image Representation : The digital image is represented as a two-dimensional array of numbers. In other words, the digital image can be considered a matrix whose row and column indices identify a point in the image and whose corresponding matrix element value identifies the gray level at that point. The elements of such a digital array are called ‘image elements’, ‘picture elements’, ‘pixels’ or ‘pels’.

If each gray level is represented as 8 bits, then there are 2^8 or 256 possible gray levels. These levels are usually assigned integer values ranging from 0 to 255, with 0 representing the darkest intensity level and 255 the brightest.
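The matrix view of a digital image is easy to demonstrate (a minimal sketch using NumPy, which the article does not itself mention):

```python
import numpy as np

# An 8-bit grayscale image: a 2-D array whose element (row, column)
# holds the gray level at that pixel, from 0 (darkest) to 255 (brightest).
image = np.array([[0,   64],
                  [128, 255]], dtype=np.uint8)

n_levels = 2 ** (8 * image.dtype.itemsize)   # 8 bits -> 2**8 = 256 levels
darkest = image.min()                        # 0
brightest = image.max()                      # 255
```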

Image Enhancement : Image enhancement is the basic tool used to improve the visual appearance of images for human perception. Image enhancement is useful in feature extraction, image analysis and visual information display. Image Restoration : Image restoration refers to the removal or minimization of known degradations in an image. This includes de-blurring of images degraded by the limitations of a sensor, noise filtering and correction of geometric distortion.
Image Analysis : Image analysis is concerned with making quantitative measurements from an image to produce a description of it. Image Reconstruction : Image reconstruction from projections is a special class of image restoration problems where a two- or higher-dimensional object is reconstructed from several one-dimensional projections. Image Data Compression : Image data compression techniques are concerned with reducing the number of bits required to store or transmit images without any appreciable loss of information.

Image enhancement remains a very important topic because of its usefulness in all image-processing applications. The enhancement process does not increase the inherent information content in the data, but it does increase the dynamic range of the chosen features so that they can be detected easily.

One of the main drawbacks of histogram equalization is that the brightness of an image is changed after equalization. Thus, it is rarely utilized in consumer electronic products such as TV, where preserving the original brightness is necessary in order not to introduce unnecessary visual deterioration. A darker image becomes much brighter and vice versa. In short, histogram equalization does not take the mean brightness of an image into account.

The ultimate goal of the bi-histogram equalization algorithm is to preserve the mean brightness of a given image while the contrast is enhanced. Brightness Preserving Bi-Histogram Equalization (BBHE) is the novel extension of histogram equalization. By this algorithm, the contrast of the image is enhanced without changing the mean brightness of input image.

Brightness Preserving Bi-Histogram Equalization (BBHE) first decomposes an input image into two sub-images based on the mean of the input image. One of the sub-images is the set of samples less than or equal to the mean, whereas the other is the set of samples greater than the mean. BBHE then equalizes the sub-images independently based on their respective histograms, with the constraint that the samples in the former set are mapped into the range from the minimum gray level to the input mean, and the samples in the latter set are mapped into the range from the mean to the maximum gray level. In other words, one of the sub-images is equalized over the range up to the mean, and the other sub-image is equalized over the range from the mean, based on the respective histograms. Thus, the resulting equalized sub-images are bounded by each other around the input mean, which has the effect of preserving the mean brightness.
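The split-and-equalize procedure above can be sketched with NumPy. This is a minimal sketch of BBHE, assuming an 8-bit image in which both sub-images are non-empty; the helper names are illustrative:

```python
import numpy as np

def equalize_range(values, hist_range, out_lo, out_hi):
    """Histogram-equalize the flat integer array `values` into the
    output range [out_lo, out_hi], using the histogram over hist_range."""
    lo, hi = hist_range
    hist, _ = np.histogram(values, bins=hi - lo + 1, range=(lo, hi + 1))
    cdf = hist.cumsum() / hist.sum()
    return out_lo + (out_hi - out_lo) * cdf[values - lo]

def bbhe(image):
    """Brightness Preserving Bi-Histogram Equalization, as described in
    the text: split at the mean, equalize the lower sub-image into
    [0, mean] and the upper sub-image into [mean + 1, 255]."""
    img = image.astype(np.int64)
    mean = int(img.mean())
    low = img <= mean
    out = np.empty(img.shape, dtype=np.float64)
    out[low] = equalize_range(img[low], (int(img.min()), mean), 0, mean)
    out[~low] = equalize_range(img[~low], (mean + 1, int(img.max())), mean + 1, 255)
    return out.round().astype(np.uint8)
```

Because each half is stretched only within its own side of the mean, the output mean stays close to the input mean, unlike plain histogram equalization, which pulls it towards mid-gray.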

A thorough understanding of this article lets anyone visualize the better enhancement achieved by bi-histogram equalization. It is a newly developed contrast enhancement algorithm. Based on this algorithm, it is clear that brightness preserving bi-histogram equalization is capable of preserving the mean brightness of a given image while enhancing its contrast.

Sunday, April 25, 2004



1. School of Engineering & Technology, Bharathidasan University

Indian economic sectors, viz. agriculture and industry, are getting exposed to an entirely new set of technologies. The development of biotechnology in agriculture aims to improve the productivity of important cash crops, while in the industrial sector biotechnology helps in the production of natural compounds and enzymes. India was one of the first among the developing countries to have recognized, as early as 1980, the importance of biotechnology as a tool to advance the growth of the agriculture and health sectors. With rapid strides in its development, the biotechnology sector in India is poised to reach USD 1.8 billion this year.

This quantum of growth was achieved due to the availability of trained manpower, a vast knowledge base, growing multinational companies and indigenous R&D efforts. Prospects like vaccines, diagnostics, bioactives, therapeutic proteins, hybrid seeds and biopesticides are opening up vast opportunities in our country. India’s share of the global biotech market is increasing significantly. The consumption of biotech products in India was $1789 million during 1999 and is expected to increase to the tune of $4270 million by the end of 2010. The rich human capital of India is believed to be its strongest asset, with a large English-speaking skill base, three million graduates, 700 thousand postgraduates and an enormous pool of Ph.D.-qualified talent in biosciences and engineering. It is estimated that 10% of research scientists in pharma/biotech R&D in the USA are of Indian origin. Biotech industries in India are at present at the threshold of tremendous growth, with an increase of approximately 20% ($230 million) in the human and animal product segments.

India’s first genetically engineered vaccine, produced by Shanta Biotech against Hepatitis B, costs less than 50% (approximately $4) of vaccines marketed from the developed countries. The investment opportunities in India are very promising: fresh investment with a potential of about $200 million in turnover is expected in the next 5 to 10 years. According to a McKinsey study, the Indian pharma/biotech industry is poised to grow to $25 billion by 2010. The vaccine market in 2001 was $100 million. In 2003, streptokinase, interferon alpha 2b and EPO were released.

The Indian government has granted marketing licenses for about 25 recombinant therapeutics. Of these, about 11 recombinant products have already been approved by the Genetic Engineering Approval Committee (GEAC), namely insulin, interferon alpha, interferon gamma, interleukin-2, GM-CSF, Hepatitis B vaccine, G-CSF, erythropoietin, streptokinase, EGF and chymotrypsin. Indian biotech firms like Shanta Biotech have launched other products such as plasminogen and interferon vaccines. Following the suit of globalization, many multinationals like Monsanto, Dupont and Bayer have set up their business in India. Collaborative ventures like Eli Lilly-Ranbaxy and Hoechst Roussel Vet have developed a low-abortion IBH vaccine in India.

Over the last decade the number of biotech industries in India has grown at a rapid pace; there are about 170 biotech-based companies in India. It is expected that nearly all products in the future would become biotech-oriented. As more and more organizations continue to embrace biotech-based techniques, a time would come when one would be hard pressed to distinguish biotech from the mainstream. Given the kind of support the government has shown to this sector, and the missionary zeal with which the sector is being promoted by various industry associations such as CII, ASSOCHAM and FICCI, the future of the biotech industry in India undoubtedly seems bright.

Monday, March 29, 2004



1. School of Engineering & Technology, Bharathidasan University.

2. Department of Biotechnology, Bharathidasan University.

3. Periyar E.V.R. College, Tiruchirappalli.

Nanotechnology is being heralded as an important technology that will replace most of the existing technologies in use today. Norio Taniguchi first used the term nanotechnology in 1974. It is widely publicized as a new technology that is going to change every aspect of our life and lead to a generation of new capabilities and new products in new markets. Nanotechnology is multidisciplinary in nature, and its impact on society is going to be widespread and all-pervasive. A number of new firms have been established in recent years with the specific objective of exploiting one or more of the several avenues that this new technology provides.


A nanometer is one thousandth of a micron, one millionth of a millimeter, or 10^-9 of a meter - roughly the length occupied by five to ten atoms stacked in a straight line. The hydrogen atom measures 0.1 nanometer, while a virus may be about 100 nanometers in size and an RBC approximately 10,000 nanometers in diameter. The virus is believed to be the most delicate nanocomponent in nature. Nanotechnology represents the convergence of modern chemistry, physics and biology.

The living cells that first emerged over 3.5 billion years ago are the best specimens of machines that operate at the nanoscale, performing a host of jobs like generating energy. No process engineered by mankind - chemical, biotechnological or mechanical - has been able to reach anywhere near the levels of perfection observed in living cells.


Nanotechnology is defined as dealing with materials in the range of 0.1 to 100 nanometers. The Scanning Tunnelling Microscope (STM), invented in 1981, permitted human beings to see atoms. The Atomic Force Microscope (AFM), scanning probe microscopes, optical techniques, lithographic tools, nuclear techniques, nuclear magnetic resonance and laser equipment, as well as computer modelling techniques, are used to work at the scale of nanometers.

An emerging science still in its infancy, nanotechnology is more descriptively known as molecular manufacturing at the atomic scale. Nanotechnology is a bottom-up approach (defect-free structures), whereas the manufacturing of silicon chips is a top-down approach. It is a field at the junction of chemistry, physics, biology, computer science and engineering.

Impact of nanotechnology on our life:

  1. Clean and abundant energy
  2. Pollution free and inexpensive materials.
  3. Defect free materials.
  4. Environmental restoration and clean up.
  5. Safe space travel and colonization.
  6. Advancement in medicine.
  7. Carries hydrophilicity and surface charge.
  8. Nanoparticle coated (tween 80) AZT was more in Liver, lung, and spleen compared to noncoated AZT. Nanoparticles are able to target drugs to certain organs. They are promising in AIDS therapy.
  9. Heating the tumour to 45 °C (hyperthermia) at precise points (nanosphere sugar-coated magnetic iron-oxide particles - ferrofluids) helps fight targeted cancer cells.
  10. The Atomic Force Microscope is used as a tool to grab individual atoms and molecules and reposition them.
  11. Considerable quantities of hydrogen have been stored in nanotubes (future energy carriers).
  12. Nanoshell-polymer (acrylamide) composites act as drug-delivery materials: when optically absorbing gold nanoshells embedded in the matrix are illuminated at their resonance wavelength, the nanoshells transfer heat to the local environment, so remote-controlled drug release can be effected.


Biosynthesis of nanoparticles using Fungi and Actinomycete

1. Microorganisms can be used in the synthesis of inorganic nanoparticles (metal nanoparticles).

2. Bacteria, yeasts, algae, fungi and actinomycetes.

3. Viable alternative to chemical methods.

4. Extracellular secretion of enzymes for Nanoparticle synthesis (Nanoparticles are formed as by-products of reduction process).

5. Nanoparticles of different chemical compositions, shapes and sizes.

6. Organisms can be genetically engineered for this purpose.

7. Natural nanofactories.

8. Cd²⁺, Ag⁺, AuCl₄⁻, SO₄²⁻ (sulphates reduced to sulphides).

Nanotechnology development

Government spending on Nanotechnology (2002)

  Country      Amount (millions of US dollars)
  Japan        750
  China        200
  Taiwan       111
  Korea        150
  Singapore     40

US Government focus areas in Nanotechnology

  Focus area                 Amount (millions of US dollars)
  Research and Development   221
  Defence                    201
  Energy                     139
  Standards                   44
  Health                      43
  Space                       51


There is a projected worldwide market size of over $1 trillion annually in the next 10 to 15 years. Nanomaterials can create a wealth of new business opportunities, and there is a general consensus that nanotechnology will be a big and dominant industry. A large number of firms have opened units devoted to nanotechnology; of these, around 100 are public firms or large multinationals, and the remaining are private firms. Companies such as Dupont, DOW and BASF are among the large companies focusing nanomaterials research on specific applications related to semiconductor chemicals, LCD panels and automobile parts. Other large companies such as IBM, Hewlett Packard and Bell Labs are focusing on electronic components and memory. Several leading biotechnology companies, including Amgen, Genentech and Pharmacopeia, are turning their focus towards health-related nanotechnology applications.


  1. Molecular Electronics

Conventional semiconductor devices follow Moore’s law and are approaching their physical limits. Hybrid circuits incorporate conventional as well as molecular components (quantum dots, single-electron transistors) to control electron tunnelling and amplify current. Nanotechnology-based memory chips (CPUs) are also available.

  2. Sensor

A carbon nanotube carbon dioxide sensor measures CO2 concentration.

  3. Aerospace

Aerospace involves high temperature, extreme pressure, hard vacuum and high radiation; the development of heat-resistant polymers and other materials, miniature computers, molecular machines based on chemistry that can survive in space, and assembly methods compatible with conditions in space will greatly benefit aerospace applications.

Nanomedicines : Nanotechnology makes the construction of micron-scale machines possible. Respirocytes are artificial mechanical RBCs: micron-sized diamondoid oxygen-storage tanks floating in the bloodstream.

Delivery of drugs to targeted sites : Techniques for drug release and for injecting substances into cells, such as nanotube syringes, have been developed. Nano-engineered prosthetics (artificial bones) have also been developed.

Environment and sanitation : In nanotech-based materials, a layer of functional groups can remove heavy metals (e.g., Hg) from aqueous and organic liquids. Nanotech machines are proposed for water treatment, extraction of toxics and detection of pollutants.

Carbon Nanotubes (CNT)

  1. Excellent electron field emitters
  2. High mechanical strength
  3. High thermal conductivity.
  4. Excellent chemical and thermal stability.
  5. Emit electrons at room temperature.
  6. The emission current can be continuous (GHz).

DNA based Molecular nanotechnology

Nucleic acids have a special ability for self-organization, and genomic DNA acts as a framework for the building blocks. The first application of DNA nanoparticles is ‘chip technology’ for the identification of DNA molecules.

Gold Nanoparticle bio-conjugate based colorimetric assay.

The characteristic red colour of gold colloid (conjugated with mercaptoalkyl-oligonucleotides) has long been known; it changes to a bluish-purple colour upon colloid aggregation. Gold colloid (red) + single-stranded target oligonucleotide -> target oligonucleotide hybridized with the conjugated oligonucleotide (a dramatic red-to-blue macroscopic colour change). This colorimetric method can be used to detect about 10 femtomolar (10^-15 M) of an oligonucleotide, which is 50 times more sensitive than sandwich hybridization.


Most drugs have pharmacological effects but also exhibit side effects; therefore drug targeting becomes necessary. Paul Ehrlich first described the “magic bullet”, which guides a drug directly to the target cell so that the drug does not affect surrounding cells.

Nanoparticles are solid colloidal particles ranging in size from 10 nm to 1000 nm. They consist of macromolecular materials in which the active principle (drug or biologically active principle) is dissolved, entrapped or encapsulated.

Requirements of an ideal vector (Nanoparticles) for drug targeting

  1. High stability
  2. Capable of extended circulation in the blood stream.
3. Small enough to gain access to target tissues/ target cells.
  4. Flexible tropism (disease targets).
  5. Must be able to deliver active moiety into the cells.
  6. Must be capable of escaping endosome- lysosome processing.
7. Nanoparticle matrix: biodegradable in nature.


  1. Nanotechnology is intertwined with biotechnology and information technology.
  2. General funding from government and venture funds should support nanotechnology research.
  3. It is time that India forges a nanotechnology policy in tune with the specific needs of the country.
  4. Nanotechnology will be a dominant force in the days to come.
  5. Institutes like IIT, IISc and CCMB may become the nation’s hubs for nanotechnology research in the future.
  6. The myriad applications of nanotechnology in diversified fields of industry are highlighted in Fig. 1.

[Figure: The Bead ARray Counter (BARC) - a magnetic-field sensor chip in which probe DNA captures biowarfare (e.g., anthrax) target DNA labelled with streptavidin-linked magnetic beads.]