Computer Aided Diagnosis for Lung Cancer

Cancer is a disease in which abnormal body cells divide very rapidly and produce excess tissue that forms a tumour. Cancer cells are capable of spreading to other parts of the body through the blood and lymph systems. There are many types of cancer.

When uncontrolled cell growth occurs in one or both lungs, it is called lung cancer. Instead of developing into healthy, normal lung tissue, these abnormal cells continue dividing and form lumps or masses of tissue called tumours. These tumours disturb the main function of the lung, which is to supply the bloodstream with oxygen for the whole body.

Types of Lung Cancer

Cancers that begin in the lungs are divided into two major types, non-small cell lung cancer and small cell lung cancer, depending on how the cells look under a microscope. Each type of lung cancer grows and spreads in different ways and is treated differently.

Small cell lung cancer (SCLC)

SCLC is usually believed to be a systemic disease at the time of diagnosis, and therefore surgery plays no part in the management of this disease.

SCLC staging

Limited disease: confined to one hemithorax and can be encompassed within a reasonable field of thoracic radiotherapy.

Extensive disease: extends beyond one hemithorax or cannot be encompassed within a reasonable field of thoracic radiotherapy.

Non-small cell lung cancer (NSCLC)

NSCLC progresses to a great extent later in its course than SCLC, and consequently surgery offers the best chance of cure. Patients considered for surgical treatment must be carefully staged to determine tumour resectability. PET helps to assess nodal involvement. However, only 15% of patients are suitable for resection at diagnosis. The patient must also be carefully assessed pre-operatively to ensure fitness for surgery.

Small-cell lung cancer (SCLC) differs from non-small-cell lung cancer in the following ways:

SCLC grows rapidly.

SCLC spreads rapidly.

SCLC responds well to chemotherapy and radiotherapy.

SCLC is frequently associated with distinct paraneoplastic syndromes.

Lung cancer is one of the most dangerous cancers in the world, with the lowest survival rate after diagnosis and a gradual increase in the mortality rate every year. The probability of surviving lung cancer is inversely related to the tumour's growth at the time of its detection, so successful treatment is possible only if the disease is detected at an early stage. It is estimated that 85% of lung cancer cases in males and 75% in females are caused by cigarette smoking [104]. The overall survival rate for all types of cancer is 63%. Although surgery, radiotherapy, and chemotherapy have been used in the treatment of lung cancer, the five-year survival rate for all stages combined is only 14%. This has not changed in the past three decades [105].


Figure 4.1: An illustration of Lung Cancer

Several thin-section CT images are produced in the clinic for each patient and are evaluated by a radiologist in the traditional way of examining each image in axial mode. Many of these images are very difficult to interpret and consume a lot of time, causing high false-negative rates in detecting small lung nodules and thus potentially missing a cancer. The central idea of designing a CAD system is to make a machine algorithm act as a support to the radiologist and point out the locations of suspicious objects, so that the overall sensitivity is raised.

A CAD system must meet the following requirements:

improving the quality and accuracy of diagnosis,

increasing therapy success by early detection of cancer,

avoiding unnecessary biopsies, and

reducing radiologist reading time [106].

This chapter proposes a CAD system for early detection of lung cancer based on an automatic diagnosis of the lung regions included in chest CT images using a neural network. Fuzzy Possibilistic C-Means (FPCM) is used for clustering in the proposed approach.


Various neural network techniques have been employed in cancer detection approaches. Recently, ANNs have become a major research area in health-care modelling, and it is believed that they will find extensive application in biomedical systems in the coming years [107]. Neural networks learn by example, so the details of how to recognize the disease are not needed; all that is required is a set of examples (patterns) representative of all the variations of the particular disease. A high accuracy level in disease recognition is obtained by carefully choosing the patterns.

Artificial neural networks (ANN)

These are basic models of the biological nervous system and are inspired by the kind of computing performed by the human brain. An ANN is a highly parallel distributed processing system made up of densely interconnected neural computing elements that have the ability to learn and thereby acquire knowledge and make it available for use [108]. Data obtained by electrical impedance spectroscopy have a strong relation with soft computing in distinguishing cancerous areas from normal areas, so an ANN, as an information processing system, can be used as an appropriate tool for cancer detection.

ANNs share certain performance characteristics with biological neural networks. An ANN contains nodes that are connected through weights. Each node receives data from the preceding nodes, combines it, passes it through a nonlinear function, and propagates the result to the succeeding nodes. ANN operation has two phases:

training phase

test phase

In the training phase, the input patterns are presented to the ANN and the weights are adjusted and fixed so that the network learns these patterns. In the test phase, on the other hand, patterns that were not used in the training phase are presented to the ANN, and the ANN's outputs are used to estimate its performance [109]. If the performance of the ANN is satisfactory, it can be used in its own specific application.

Artificial Neural Network Structures

Neural networks have been widely used in various applications such as pattern classification, pattern completion, function approximation, optimisation, prediction, and automatic control. ANNs fall into two categories, supervised and unsupervised learning. An ANN is supervised only if the outputs of the input patterns used in its training phase are available from a particular experiment; otherwise it is unsupervised.

Supervised ANNs can be further categorized into two groups, namely error-based and prototype-based. The main aim of an error-based network is to reduce a cost function defined on the basis of the error between the desired output and the network output. The main aim of a prototype-based network is to reduce the distance between the input patterns and the prototypes assigned to each cluster.

The Multilayer Perceptron (MLP) and Radial Basis Function (RBF) networks are examples of error-based networks, and Learning Vector Quantization (LVQ) is an example of a prototype-based network.

The Multilayer Perceptron (MLP) network is one of the most important supervised neural network structures. It is a feed-forward layered network with one input layer, one output layer, and some hidden layers [109]. MLP training is based on the minimisation of a suitable cost function and is called the back-propagation algorithm. The first version of this algorithm, based on the gradient descent technique, was proposed by Werbos [110] and Parker [111].

The basic construction of a Radial Basis Function (RBF) network comprises three layers with entirely different roles. The input layer consists of source nodes that connect the network to its environment. The second layer applies a nonlinear transformation from the input space to the hidden space; in most applications the hidden space is of high dimensionality. The output layer is linear, supplying the response of the network to the activation pattern applied to the input layer [112-113].

Learning Vector Quantization (LVQ) was introduced by Linde et al. [114] and Gray [115]. It was initially used for image data compression and was later adapted by Kohonen [116] for pattern recognition. The central idea is to divide the input space into a number of distinct regions, called decision regions.

Simplified Fuzzy ARTMAP (SFAM), the abridged model of fuzzy adaptive resonance theory, is a prototype-based network that can handle both binary and analogue data in a supervised manner. Despite the high practical potential of the SFAM network, its intricacy discourages wider use.

These four different ANN structures can be used to predict the malignancy of different cancers.

Wavelet Neural Network

The multilayer perceptron (MLP) with the back-propagation learning algorithm is the most popular type of ANN in practical situations [117]-[118]. However, the disadvantages of an MLP are:

difficulty in reaching the global minimum in a complex search space,

time-consuming training, and

failure to converge when high nonlinearities exist.

These limitations have deteriorated the accuracy of its applications. To overcome the deficiencies of an MLP, the Wavelet Neural Network (WNN) has been introduced as a viable alternative [119]. Wavelet families are incorporated as the activation function in the hidden layer of WNNs. Several issues concern WNNs, ranging from different learning algorithms and network architectures to the type of activation functions used in the hidden layer and the parameter initialization.

A proper initialization of the network parameters is a key factor in achieving a faster convergence rate and a higher accuracy rate. Approaches using an explicit expression, hierarchical clustering, support vector machines, genetic algorithms, and K-Means clustering are among those that have been implemented for parameter initialization [120, 121]. Various clustering algorithms, namely K-Means (KM), Fuzzy C-Means (FCM), symmetry-based K-Means (SBKM), symmetry-based Fuzzy C-Means (SBFCM), and modified point symmetry-based K-Means (MPKM), are available for initializing the WNN translation parameters. These clustering algorithms can be integrated into the WNN and applied to a real-world application, where the main concern is the classification of heterogeneous cancers using microarray data.

Probabilistic Neural Network (PNN)

The PNN was developed by Specht [122][123]. It provides a general solution to pattern classification problems by following a probabilistic approach based on the Bayes formula. The Bayes decision theory that emerged from this formula takes into account the relative likelihood of events and uses a priori information to improve prediction. The network model uses Parzen estimators to obtain the probability density functions (p.d.f.) corresponding to the classification categories. Parzen [124] showed that this class of p.d.f. estimators asymptotically approaches the underlying density function, provided that it is continuous. Cacoullos [125] extended Parzen's approach to the multivariate case.

A supervised training set is used by the PNN to develop probability density functions within a pattern layer. Training a PNN is much simpler than other ANN techniques. Key advantages of the PNN are that training requires only a single pass and that the decision hypersurfaces are guaranteed to approach the Bayes-optimal decision boundaries as the number of training samples grows. On the other hand, the main limitation of the PNN is that all training samples must be stored and used when classifying new patterns. To decrease the computational cost, dimensionality reduction and clustering approaches are usually applied prior to PNN construction.

The PNN-based decision approach has been applied to classify a group of individuals into certain diagnostic categories in the area of cancer diseases.
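As an illustration, a minimal Parzen-window classifier in the spirit of a PNN can be sketched as follows (toy data; the function name and the smoothing parameter `sigma` are our own choices, not taken from the cited works):

```python
import numpy as np

def pnn_classify(train_X, train_y, x, sigma=0.5):
    """Score each class by the average Gaussian (Parzen) kernel between
    x and that class's training samples, then return the class with the
    highest estimated density -- the single-pass 'training' is simply
    storing the samples."""
    scores = {}
    for c in np.unique(train_y):
        members = train_X[train_y == c]
        d2 = np.sum((members - x) ** 2, axis=1)
        scores[c] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)

# two toy 2-D classes
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_classify(X, y, np.array([0.05, 0.1])))  # -> 0
```

The sketch also makes the stated limitation concrete: every training sample is kept and revisited for each new pattern, which is why dimensionality reduction or clustering is usually applied first.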


A new method, an automatic Computer-Aided Diagnosis (CAD) system, is presented. The system is used for early detection of lung cancer by analysing chest 3D computed tomography (CT) images. In the first stage of this CAD system, basic image processing techniques are used to extract the lung regions. The extracted lung regions in each slice are segmented using the Hopfield Neural Network (HANN), which shows good segmentation results in a short time. A Fuzzy Possibilistic C-Means (FPCM) algorithm is presented that incorporates spatial information into the membership function for clustering.

Lung Regions Extraction

The main limitation of the earlier grey-level thresholding techniques is the problem of selecting suitable and accurate threshold values. Furthermore, some approaches, as in [126], need a post-processing step to compensate for the lost regions that may occur as a result of using the thresholding technique. To overcome the problems of the thresholding methods, a new method is proposed in this chapter for the automatic extraction of lung regions based on one of the distinguishing features of the raw data, obtained using the bit-plane slicing technique. The extraction approach described in this section is fully automatic and depends on a set of basic digital image processing techniques adapted to the CT data. The preliminary results obtained by applying the proposed algorithm to a 3D dataset consisting of 2668 2D CT images from 11 subjects have been significant. A CT image of the chest contains different regions such as the background, lung, heart, liver, and other organ areas. The goal of the lung region extraction step is to separate the lung regions, our regions of interest (ROIs), from the surrounding anatomical structures.

Lung region extraction

Segmentation of the lung region using FPCM

Analysis of the segmented lung region

Formation of diagnostic rules

Testing and evaluation

Figure 4.2: The Lung Cancer Detection System

Figure 4.2 illustrates the proposed method for the extraction of the lung regions from 3D CT chest images. Initially, the bit-plane slicing algorithm [127] is applied to each 2D CT image of the raw data. The resulting binary slices are then analysed to choose among them the best bit-plane image that may help in extracting the lung regions from the raw CT image data with a certain degree of accuracy and sharpness.
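For an 8-bit image, bit-plane slicing simply isolates one bit of every pixel; a minimal sketch (NumPy, with hypothetical pixel values):

```python
import numpy as np

def bit_plane(img, plane):
    """Shift the 8-bit pixel values right by `plane` bits and keep the
    least significant bit, yielding a binary image for that plane."""
    return (img >> plane) & 1

img = np.array([[12, 200], [129, 3]], dtype=np.uint8)
# plane 7 is the most significant bit: 1 exactly for values >= 128
print(bit_plane(img, 7))
```

Running this over planes 0-7 produces the eight binary slices from which the best plane is selected.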

Original Image

Bit-Plane Slicing

Erosion

Median Filter

Dilation

Outlining

Lung Border Extraction

Flood Fill Algorithm

Extracted Lung
Figure 4.3: The proposed lung region extraction method

To refine the chosen bit-plane image, other techniques are used for different purposes in a sequence of steps. The main purpose of the erosion, median filter, and dilation steps is to eliminate irrelevant details that may add extra difficulty to the lung border extraction process. The main goal of the outlining step is to extract the structures' borders, and the main purpose of the lung border extraction step is to separate the lung structure from all other uninteresting structures. Finally, in order to fill the extracted lung regions with their original intensities, a stack-based flood-fill technique is used. Figure 4.4 shows the results of applying the proposed lung region extraction method step by step to a given CT image.
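A rough sketch of the erosion, dilation, and stack-based flood-fill building blocks (pure NumPy, a 3x3 structuring element, 4-connected fill; the function names are ours):

```python
import numpy as np

def erode(b):
    """3x3 binary erosion: a pixel survives only if its whole
    3x3 neighbourhood lies inside the foreground."""
    p = np.pad(b, 1)
    h, w = b.shape
    out = np.ones_like(b)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= p[dy:dy + h, dx:dx + w]
    return out

def dilate(b):
    """3x3 binary dilation: a pixel fires if any neighbour is set."""
    p = np.pad(b, 1)
    h, w = b.shape
    out = np.zeros_like(b)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + h, dx:dx + w]
    return out

def flood_fill(mask, seed):
    """Stack-based flood fill: return the 4-connected component
    of `mask` containing `seed`."""
    h, w = mask.shape
    filled = np.zeros_like(mask)
    stack = [seed]
    while stack:
        y, x = stack.pop()
        if 0 <= y < h and 0 <= x < w and mask[y, x] and not filled[y, x]:
            filled[y, x] = True
            stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return filled
```

A real pipeline would use library implementations of these operators, but the sketch shows the mechanics of each refinement step.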


Figure 4.4: Lung region extraction algorithm: a. original CT image, b. bit-plane-2, c. erosion, d. median filter, e. dilation, f. outlining, g. lung region borders, and h. extracted lung.

Lung Regions Segmentation

After successfully extracting the lung regions from the raw CT images, as described in the previous section, the second step of the proposed CAD system is lung region segmentation, which aims to segment the extracted lung regions in search of cancerous cell candidates, the new regions of interest (ROIs). A huge number of candidates are chosen, with a large number of non-cancerous candidates, or false positives, and only a few cancerous candidates. ANNs are well-known approaches used for many purposes and in many applications; they are application-independent and work well with most applications. The proposed approach uses ANNs to solve the lung region segmentation problem.

Various ANN techniques are available for this purpose, but the Hopfield Neural Network is used here due to its significant performance.

Hopfield Artificial Neural Network (HANN)

The Hopfield Neural Network (HANN) is one of the ANNs that has been used in the literature for different purposes. Its main use in the medical image processing field is the classification of Magnetic Resonance (MR) images of the brain based on energy minimisation, as described in [128, 129]. The performance of the HANN has been found to be significant. The algorithm is enhanced to overcome some problems, such as handling the minimisation of the sum of squared errors and ensuring the convergence of the network within a pre-specified period of time.

Figure 4.5: Architecture of HANN

The improved version of the HANN used in [129] was applied to MR images of the brain; the same algorithm is used here for the segmentation of the extracted lung regions. The extracted lung region segmentation problem is formulated as the minimisation of an energy function constructed from a cost term expressed as a sum of squared errors. In order to guarantee the convergence of the network, the minimisation is achieved with a step function permitting the network to reach its stability within a pre-specified period of time.

The HANN architecture consists of a single layer representing a grid of N x M neurons, with each column representing a class and each row representing a pixel. All neurons work as both input and output neurons simultaneously; in fact, the neurons under each class hold the probability that the corresponding pixel belongs to that class. N is the size of the given image, and M is the number of classes, which is given as a priori information. The network is designed to classify the feature space without a teacher, based on the concentration of each class calculated using the distance measure (Rkl) between the kth pixel and the centroid of class l. The segmentation problem is formulated as a partition of N pixels among M classes such that the assignment of the pixels minimises the cost term of the energy (error) function:

$E = \frac{1}{2}\sum_{k=1}^{N}\sum_{l=1}^{M} R_{kl}\,V_{kl}^{n}$ ( 1 )

where Rkl represents the distance measure between the kth pixel and the centroid of class l, defined as follows:

$R_{kl} = \left(X_k - \bar{X}_l\right)^2$ ( 2 )

where Xk is the feature (intensity) value of the kth pixel and $\bar{X}_l$ is the centroid value of class l, defined as follows:

$\bar{X}_l = \frac{1}{n_l}\sum_{k \in l} X_k$ ( 3 )

where nl is the number of pixels in class l. Considering the case n = 2, the energy is defined as the sum-squared error, and Vkl is the output of the klth neuron. This approach adopts the winner-takes-all learning rule, where the input-output function for the kth row (to assign a label m to the kth pixel) is given by:

$V_{km} = \begin{cases} 1 & \text{if } U_{km} = \max_{l}\,(U_{kl}) \\ 0 & \text{otherwise} \end{cases}$ ( 4 )

The minimisation is achieved by using the Hopfield neural network (HANN) and by solving a set of motion equations satisfying:

$\frac{dU_i}{dt} = -\mu(t)\,\frac{\partial E(V)}{\partial V_i}$ ( 5 )

where Ui and Vi respectively represent the input and output of the ith neuron, and $\mu(t)$ represents a scalar positive function of time, which determines the length of the step to be taken in the direction of the vector $d = -\nabla E(V)$. The suitable choice of the step $\mu(t)$ is something of an art; experimentation and familiarity with a given class of optimisation problems are often required to find the best function [129]. The $\mu(t)$ function used in [129] for segmenting the MR data using the HANN is adopted in this approach and works well for segmenting the CT data too:

( 6 )

where t represents the iteration step and Ts is the pre-specified convergence time. The HANN segmentation algorithm can be summarized in the following steps:

Initialize the inputs of the neurons to random values.

Apply the input-output function (Vkl) defined above to obtain the new output value for each neuron, establishing the assignment of pixels to classes. The class membership probabilities grow or shrink in a winner-takes-all fashion as a result of competition between classes. In the winner-takes-all model, the neuron with the highest input value fires and takes the value 1, and all remaining neurons take the value 0.

Compute the centroid ($\bar{X}_l$), as defined above, for each class l.

Compute the energy function (E) as defined above.

Update the inputs (Ui) using the following equation; learning occurs as the neuron inputs are adjusted in an attempt to reduce the output error.

$U_i(t+1) = U_i(t) + \frac{dU_i(t)}{dt}$ ( 7 )

Repeat from step 2 until t = Ts. This process iteratively modifies the pixel label assignments to reach a near-optimal final segmentation map.
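A drastically simplified, 1-D sketch of this winner-takes-all loop (not the full Hopfield dynamics: the step function $\mu(t)$ is replaced by a constant step, and for reproducibility the inputs are seeded from the darkest and brightest pixels rather than random values):

```python
import numpy as np

def hann_like_segment(pixels, n_iter=120):
    """Toy two-class winner-takes-all loop in the spirit of the HANN
    steps: outputs fire the class with the highest input, centroids
    are recomputed from the winners, and the inputs are pushed down
    by the squared distance R_kl to each class centroid."""
    lo, hi = pixels.min(), pixels.max()
    U = -np.stack([(pixels - lo) ** 2, (pixels - hi) ** 2], axis=1)
    for _ in range(n_iter):
        V = U == U.max(axis=1, keepdims=True)            # step 2: winner takes all
        counts = np.maximum(V.sum(axis=0), 1)            # guard empty classes
        cent = (V * pixels[:, None]).sum(axis=0) / counts  # step 3: class centroids
        R = (pixels[:, None] - cent[None, :]) ** 2       # distance measure R_kl
        U = U - 0.5 * R                                  # step 5: update the inputs
    return U.argmax(axis=1)

px = np.array([0.0, 0.1, 0.05, 5.0, 5.2, 4.9])
print(hann_like_segment(px))  # the two intensity groups get different labels
```

The essential behaviour survives the simplification: inputs accumulate penalties proportional to each pixel's distance from a class centroid, so the winning label converges to the nearest class.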

HANN Segmentation Results

The HANN with the specifications mentioned above is applied to each of the extracted lung regions over the whole dataset, and the results are held for further processing in the following steps. The HANN segmentation results are accurate and homogeneous. In addition, the HANN takes a short time to achieve the desired segmentation results, needing fewer than 120 iterations (i.e. about 9 seconds on average).

Fuzzy Possibilistic C-Means (FPCM)

FPCM is a clustering algorithm that combines the characteristics of both fuzzy and possibilistic c-means. Memberships and typicalities are both critical for a correct characterization of the data structure in a clustering problem. Therefore, an objective function for FPCM depending on both memberships and typicalities can be written as:

$J_{m,\eta}(U,T,V) = \sum_{i=1}^{c}\sum_{k=1}^{n}\left(u_{ik}^{m} + t_{ik}^{\eta}\right)\left\|x_k - v_i\right\|^2, \quad m > 1,\ \eta > 1$ ( 8 )

subject to the following constraints:

$\sum_{i=1}^{c} u_{ik} = 1 \quad \forall\, k \in \{1,\dots,n\}$ ( 9 )

$\sum_{k=1}^{n} t_{ik} = 1 \quad \forall\, i \in \{1,\dots,c\}$ ( 10 )

A solution of the objective function can be obtained via an iterative process in which the degrees of membership, the typicalities, and the cluster centres are updated via:

$u_{ik} = \left[\sum_{j=1}^{c}\left(\frac{d_{ik}}{d_{jk}}\right)^{2/(m-1)}\right]^{-1}$, with $d_{ik} = \left\|x_k - v_i\right\|$ ( 11 )

$t_{ik} = \left[\sum_{j=1}^{n}\left(\frac{d_{ik}}{d_{ij}}\right)^{2/(\eta-1)}\right]^{-1}$ ( 12 )

$v_i = \frac{\sum_{k=1}^{n}\left(u_{ik}^{m} + t_{ik}^{\eta}\right) x_k}{\sum_{k=1}^{n}\left(u_{ik}^{m} + t_{ik}^{\eta}\right)}$ ( 13 )

FPCM produces memberships and possibilities simultaneously, along with the usual point prototypes or cluster centres for each cluster. FPCM is a hybridization of Possibilistic C-Means (PCM) and Fuzzy C-Means (FCM) which provides a solution to various problems.

The advantages of the FPCM method are the following:

it provides regions more homogeneous than other techniques,

it reduces spurious blobs,

it removes noisy spots, and

it is less sensitive to noise than other techniques.
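A toy NumPy sketch of the update loop in equations (11)-(13) (for simplicity the initial centres are seeded on evenly spaced data points rather than at random; m and eta are the membership and typicality exponents):

```python
import numpy as np

def fpcm(X, n_clusters, m=2.0, eta=2.0, n_iter=50):
    """Iterate the FPCM updates: memberships u sum to 1 over the
    clusters for each point, typicalities t sum to 1 over the points
    for each cluster, and the centres v are the (u^m + t^eta)-weighted
    means of the data."""
    v = X[np.linspace(0, len(X) - 1, n_clusters).astype(int)].astype(float)
    for _ in range(n_iter):
        # squared distances d_ik^2, clamped to avoid division by zero
        d2 = np.maximum(((X[:, None, :] - v[None, :, :]) ** 2).sum(-1), 1e-12)
        u = d2 ** (-1.0 / (m - 1))
        u /= u.sum(axis=1, keepdims=True)    # constraint (9)
        t = d2 ** (-1.0 / (eta - 1))
        t /= t.sum(axis=0, keepdims=True)    # constraint (10)
        w = u ** m + t ** eta
        v = (w.T @ X) / w.sum(axis=0)[:, None]   # update (13)
    return u, t, v

X = np.array([[0.0], [0.2], [0.1], [5.0], [5.1], [4.9]])
u, t, v = fpcm(X, 2)
print(np.sort(v[:, 0]))  # centres settle near the two intensity groups
```

Normalizing the inverse-power distances is algebraically the same as the closed forms (11) and (12), since $d^{-2/(m-1)}$ divided by its sum reproduces the ratio expression.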

Feature Extraction and Formulation of Diagnostic Rules

Once the segmentation results are obtained, the approach starts from the initial cancerous candidate objects, or nodules, namely the members of one of the classes resulting from the HANN segmentation algorithm: the members of the class with the smallest number of members are taken as the initial cancerous candidates, while the members of the other classes are set aside. Then, different features are extracted for use in the following diagnostic step, where diagnostic rules are formulated to remove the huge number of false candidates that usually results from the segmentation step.

Feature Extraction

The features used in the diagnostic rules are obtained from the literature:

the area of the candidate region,

the Maximum Drawable Circle (MDC) inside the candidate region, and

the mean intensity value of the candidate region.

Experimentally, the above features were found suitable for achieving an accurate diagnosis. The first feature (the area of the candidate region or object) is used to:

eliminate isolated pixels (seen as noise in the segmented image), and

eliminate very small candidate objects (area less than a threshold value).

This feature generally eliminates a good number of redundant candidate regions that have no chance of forming a nodule; moreover, it also reduces the computation time needed in the subsequent diagnostic steps.

The second feature represents each candidate region by its corresponding MDC. The method begins by drawing a circle from a point inside a candidate region or object; the circle must fulfil the condition that all the pixels inside it belong to the object being processed, and every pixel inside the object is considered as a starting point. The process starts by drawing a one-pixel-radius circle from a point inside the candidate region. If it succeeds, the radius is increased by one pixel and the circle is redrawn. This is repeated until the circle crosses the border of the region, and the radius of the last drawing that fulfils the condition is recorded as the MDC. The process is repeated over all candidate objects, and each object stores its maximum drawable circle for use in the diagnostic process to eliminate more false-positive cancerous candidates. Drawing the circle is simulated by examining the eight neighbours of the starting point: if all the neighbouring pixels belong to the same object as the starting point, the drawing succeeds, meaning a one-pixel radius is achieved. For a two-pixel radius the 24 neighbours are checked, and so on.
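The growing test described above can be sketched as follows; following the 8-neighbour/24-neighbour description, a square window is checked at each radius, so this is a square approximation of the drawable circle:

```python
import numpy as np

def max_drawable_circle(mask):
    """Radius r is accepted for a pixel once the (2r+1)x(2r+1) window
    around it lies fully inside the object (the 8 neighbours for r=1,
    the 24 neighbours for r=2, ...); the object's MDC is the best
    radius found over all of its pixels."""
    h, w = mask.shape
    best = 0
    for y in range(h):
        for x in range(w):
            if not mask[y, x]:
                continue
            r = 0
            while (y - r - 1 >= 0 and y + r + 1 < h and
                   x - r - 1 >= 0 and x + r + 1 < w and
                   mask[y - r - 1:y + r + 2, x - r - 1:x + r + 2].all()):
                r += 1
            best = max(best, r)
    return best

blob = np.ones((5, 5), dtype=bool)          # compact object: MDC radius 2
line = np.zeros((5, 5), dtype=bool)
line[2, :] = True                           # thin vessel-like object: MDC 0
print(max_drawable_circle(blob), max_drawable_circle(line))
```

The example mirrors the diagnostic use of the feature: a compact blob earns a large MDC, while a thin line-shaped region, typical of a vessel, earns a very small one.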

The third feature is the mean CT intensity value of the candidate region and is used to eliminate further regions that do not have the characteristics of cancerous cells. The mean intensity value represents the average intensity of all the pixels belonging to the same region (object) and is calculated as follows:

$\text{Mean}(j) = \frac{1}{n}\sum_{i=1}^{n} \text{Intensity}(i)$ ( 14 )

where j denotes the object index and ranges from 1 to the total number of candidate objects in the whole image, Intensity(i) denotes the CT intensity value of pixel i, and i ranges from 1 to n, where n is the total number of pixels belonging to object j.

Formulation of Diagnostic Rules

The above features are extracted, and then diagnostic rules are formulated to use them in the proposed CAD system.

Rule 1: For each candidate object, if its area is below the threshold value T1, it is deleted from the candidate list. Applying this filter reduces the number of false positives among the initial candidate objects, which may in turn reduce the computation time needed by the following diagnostic rules.

Rule 2: If the value of the MDC of an object is below the threshold value T2, it is deleted from the candidate list. T2 is chosen to be a 2-pixel radius, so any candidate object with an MDC of less than a 2-pixel radius is removed, as it is far from being a nodule and very close to being a blood vessel. This rule is based on the medical fact that true lung nodules show a certain circularity, especially small lung nodules. Applying this filter removes a large number of vessels, which in general have a thin oblong or line shape.

Rule 3: For each candidate object, if its mean intensity lies outside a particular range, i.e. between T3 and T4, it is deleted from the candidate list. The right values for the thresholds T3 and T4 are chosen based on medical information and experimentation; the proposed approach used the values of -9000 CT-intensity for the threshold T3 and -12500 CT-intensity for the threshold T4. This filter removes still more false positives.
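Rules 1-3 amount to a simple filter over per-candidate measurements; a sketch (the record layout and helper name are hypothetical, and T1 is illustrative since the text does not fix its value, while T2-T4 follow the text):

```python
def apply_rules(candidates, t1=4, t2=2, t3=-9000, t4=-12500):
    """Keep a candidate only if its area is at least T1, its MDC
    radius is at least T2, and its mean intensity falls inside the
    [T4, T3] range."""
    kept = []
    for c in candidates:
        if c["area"] < t1:               # rule 1: too small
            continue
        if c["mdc"] < t2:                # rule 2: too far from circular
            continue
        if not (t4 <= c["mean"] <= t3):  # rule 3: intensity out of range
            continue
        kept.append(c)
    return kept

cands = [
    {"area": 2,  "mdc": 3, "mean": -10000},  # fails rule 1
    {"area": 30, "mdc": 1, "mean": -10000},  # fails rule 2
    {"area": 30, "mdc": 3, "mean": -5000},   # fails rule 3
    {"area": 30, "mdc": 3, "mean": -10000},  # survives all rules
]
print(len(apply_rules(cands)))  # -> 1
```

Each rule only ever shrinks the candidate list, so the order of application affects the running time but not the final set of surviving candidates.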

After all the filters are applied, very few cancerous candidate objects remain. The CAD system marks all the remaining candidates as possible cancerous regions, and the images related to these regions should then be reported and displayed to radiologists for their final decision. This implies that the purpose of the proposed CAD system is not to replace radiologists, but to assist them and provide a tool that may help them detect lung cancer at early stages by alerting them to possible abnormalities. Furthermore, the proposed approach aims at improving the accuracy of detection and minimising the time spent by radiologists in analysing the vast number of slices per patient (more than 300).


This chapter discussed an automatic CAD system for early detection of lung cancer by analysing raw chest CT images. The approach starts by extracting the lung regions from the CT image using several image processing techniques, including bit-plane slicing, erosion, median filtering, dilation, outlining, and a flood-fill algorithm. A novel approach of using the bit-plane slicing technique is introduced instead of the thresholding technique as the first step of the extraction process to convert the CT image into a binary image; bit-plane slicing is both faster and more data- and user-independent than thresholding. After the extraction step, the extracted lung regions are segmented using the Fuzzy Possibilistic C-Means (FPCM) algorithm. The HANN algorithm shows homogeneous results obtained in a short time. The initial lung candidate nodules resulting from the HANN segmentation are then analysed to extract a set of features used in the diagnostic rules, which are formulated in the next step to discriminate between cancerous and non-cancerous candidate nodules. This technique is a powerful method for noisy image segmentation and works for both single- and multiple-feature data with spatial information. The features extracted in the proposed system are the segmented lung regions, the maximum drawable circle (MDC) inside the region, and the mean pixel intensity value of the region.

The next chapter deals with the next proposed approach, "A Computer Aided Diagnosis System for Lung Cancer Detection Using Support Vector Machine".