A support vector machine model for contractor prequalification
Ka Chi Lam, Ekambaram Palaneeswaran ⁎, Chen-yun Yu
Dept. of Building and Construction, City University of Hong Kong, 83 Tat Chee Avenue, Kowloon Tong, Hong Kong
Article info

Article history: Accepted 15 September 2008

Keywords: Prequalification; Contractor selection; Procurement; Support vector machine; Artificial neural network

Abstract
In complex and high value projects, prequalification is crucial for both contractors and clients, as it targets best value delivery through qualification safeguards and streamlined competition among potential candidates. Due to the complex nature of procurement problems such as prequalification exercises, robust models are rarely attempted. The research reported in this paper presents an overview of the potential suitability of the Support Vector Machine (SVM) method for contractor/consultant prequalification transactions in construction project procurement. Furthermore, the performance of SVM is compared with specific artificial neural network outcomes. The results obtained from practical datasets indicate encouraging potential for SVM applications in procurement problems such as prequalification and contractor selection. Hence, an SVM-based decision support framework is proposed.
© 2008 Elsevier B.V. All rights reserved.
1. Introduction
‘Right’ selection of suitable contractors and consultants is critical for achieving good project performance and overall success in construction projects [1–6]. Several pre-selection practices are being followed in the construction industry, and common strategies include: (a) open competition; (b) identification based on reputation/references; (c) short-listing from registered/licensed/prequalified lists; and (d) project-specific prequalification [7]. Selecting a ‘right’ pool of bidders/proposers is a paramount task for many clients in this industry [8,9]. In general, prequalifying eligible bidders/proponents is regarded as a vital safeguard for construction clients, especially in major/high value projects [10,11]. The generic benefits of contractor prequalification exercises include healthy competition, minimized risks, and improved quality potentials. These can be more prominent in non-traditional procurement arrangements such as Design-Build and Build-Operate-Transfer, which mostly involve higher risks and substantial resources [12–15].
Earlier, several researchers have delved differently to identify and establish arrays of useful criteria for contractor prequalification [1,6,8,16–18]. Consequently, numerous models/systems ranging from systematic linear frameworks [12,17,19–21] to complicated artificial intelligence applications and decision support systems were conceived to enhance prequalification [22–29]. However, due to inherent complexities such as multi-criteria considerations, precise mathematical models are seldom considered for contractor selection problems.
Recently, robust modeling with the support vector machine (SVM) is gaining significant popularity for various advantages including powerful generalization performance, and hence a plethora of applications is being reported, e.g. logical modeling [30], evaluation of consumer loans [31], studying credit rating systems [32], bank performance prediction [33], bankruptcy predictions [34], and financial forecasting [35,36]. Some such applications in the construction engineering and management domains include slope reliability analysis [37], studying settlement of shallow foundations [38], supply chain demand forecasting [39], model induction [40], document classification for information systems [41], information integration and situation assessment [42], and conceptual cost estimates in construction projects [43]. Drawing threads from such encouraging studies, SVM based robust modeling for the contractor prequalification problem was considered in this research study, and the main outcomes are presented in this paper.

2. Theory of support vector machines
The principles of the support vector machine (SVM) are based on structural risk minimization and statistical learning theory [44]. In general, the SVM technique mainly uses linear models to classify a data sample through specific nonlinear mapping into a relevant high dimensional feature space [36]. Similar to neural network based techniques, SVM based modeling also involves training and testing of data instances, such that the training set comprises target outcome variable(s) mapped from several predictor variables. The advantages of SVM include strong inference capacity, generalization ability, fast learning capacity, and ability for accurate predictions. For example, the generalization ability of SVM learning mainly depends on the capacity and basic reliance on space dimensionality and other parameters such as the upper bound and kernel parameters. In contrast, the performance of artificial neural networks relies on several controlling parameters such as
Automation in Construction 18 (2009) 321–329
⁎ Corresponding author.
E-mail addresses: bckclam@cityu.edu.hk (K.C. Lam), palanees@graduate.hku.hk (E. Palaneeswaran), chenyunyu3@cityu.edu.hk (C. Yu).
0926-5805/$ – see front matter © 2008 Elsevier B.V. All rights reserved. doi:10.1016/j.autcon.2008.09.007
the number of hidden layers, the number of hidden nodes, the learning rate, the momentum term, epochs, transfer functions and weights initialization. Moreover, balancing an optimal combination of those parameters for good neural network predictions is a challenging task in several settings. The following section provides a collated synopsis on SVM drawn from [36,37,43,45,46].
Consider a training set D_Training = {x_i, y_i}, i = 1, …, N, in which ‘x_i’ represents a set of input vectors and ‘y_i’ refers to target labels of ‘+1’ for class 1 or ‘−1’ for class 2. In this, x_i = (x_i1, x_i2, …, x_in) ∈ R^n and

y_i = +1 if x_i is in class 1; y_i = −1 if x_i is in class 2.

As per the binary classification case, the separating hyper-plane must satisfy

y_i (w^T x_i + b) ≥ 1,  i = 1, …, N

In this, ‘b’ represents the bias element and ‘w’ refers to the weight vector, such that

w · x_i + b ≥ +1 if y_i = +1;  w · x_i + b ≤ −1 if y_i = −1.

If the hyper-plane is represented by y = w^T x + b, then the distance between (w · x + b = 1) and (w · x + b = −1) is 2/||w|| = 2/√(w^T w). With Lagrange transformation considerations, the dual objective function can be represented as

max f(α) = Σ_{i=1..n} α_i − (1/2) Σ_{i=1..n} Σ_{j=1..n} α_i α_j y_i y_j (x_i · x_j)

subject to Σ_{i=1..n} α_i y_i = 0 and α_i ≥ 0, i = 1, 2, …, n; here, α_i is the non-negative Lagrange multiplier and w = Σ_{i=1..n} α_i y_i x_i.

As most classification problems are linearly non-separable instances, a slack variable (also known as a relaxation variable) ‘ξ_i’ is introduced, such that the optimization problem is changed as follows:

min_{w, b, ξ}  (w^T w)/2 + C Σ_{i=1..N} ξ_i

In this, C is the penalty parameter of the error term; y_i (w^T x_i + b) ≥ 1 − ξ_i, i = 1, …, N and ξ_i ≥ 0. For easing the computations, kernel functions are introduced, and the nonlinear classifier function for non-separable cases can be determined using the following equation:

y(x) = sgn( Σ_{i=1..N} y_i α_i K(x_i, x) + b )

In general, various kernel function types are considered in SVM models. In this research, three popular non-linear kernel functions are considered for SVM modeling:

1. Polynomial kernel function: k(x_i, x_j) = (γ x_i^T x_j + r)^d
2. Radial basis kernel function: k(x_i, x_j) = exp(−γ ||x_i − x_j||^2)
3. Sigmoid kernel function: k(x_i, x_j) = tanh(k_1 x_i^T x_j + k_2)

In these, d, r ∈ N, γ ∈ R^+, and k_1, k_2 are taken as constants.
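The three kernel choices above can be sketched with scikit-learn's SVC, which wraps the same LIBSVM library used later in this paper. The dataset and parameter values below are invented for illustration, not taken from the study:

```python
# Binary SVM classification with the three kernel types discussed above.
# Note: this uses scikit-learn rather than the raw LIBSVM batch files;
# the toy data and parameter values are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy prequalification-style data: 40 candidates, 8 normalized criteria.
X = rng.uniform(0, 1, size=(40, 8))
y = np.where(X.mean(axis=1) > 0.5, 1, -1)  # +1 = prequalified, -1 = not

kernels = {
    "polynomial": SVC(kernel="poly", degree=2, gamma=0.05, coef0=1, C=1),
    "radial basis": SVC(kernel="rbf", gamma=0.05, C=1),
    "sigmoid": SVC(kernel="sigmoid", gamma=0.05, coef0=0, C=1),
}
for name, clf in kernels.items():
    clf.fit(X, y)
    print(name, clf.score(X, y))  # training accuracy per kernel
```

In SVC terms, `gamma` plays the role of γ, `coef0` the role of r (polynomial) or k_2 (sigmoid), and `degree` the role of d.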
3. Overview of the research design

3.1. Background
The SVM-based prequalification research initiated by the first author has been subsequently explored as a part of an ongoing research project led by the second author, entitled “PICK-SMART — An Integrated Source Selection System for Construction Procurement”, which is supported by the Hong Kong Research Grants Council. In this research project, the umbrella term ‘source selection’ is used to represent the selection of various parties in construction projects such as constructors, consultants, suppliers/vendors, design-builders, and management contractors. The PICK-SMART research is mainly targeted at developing integrated systems and improved source selection frameworks. The research approach includes knowledge-mining through literature review, surveys and interviews, knowledge discovery from data-mining of archived datasets/records, case studies, modeling with simulations, and developing useful models/frameworks and decision-support systems. Initial explorations in this research revealed that relatively little has been explored on using and integrating robust models with best value procurement systems.
3.2. Datasets and research constructs
Three datasets were arranged for this research study as detailed below:
(i) Data-I: a set of hypothetically simulated data with practitioner inputs, used for building pilot models to make preliminary explorations regarding the suitability of SVM applications for the contractor prequalification problem;
(ii) Data-II: a set of normalized practical datasets collected from real prequalification cases for verifying the practical relevance of the proposed SVM models;
(iii) Data-III: previously documented datasets recollected from other practices elsewhere for further checking the generalizing potential of the proposed frameworks.
An overview of the research constructs of this study and the respective data arrangements is summarized in Table 1.
‘Data-I’ comprises ten input variables (on a ‘0 to 5’ scale), one binary output variable, and 105 hypothetical datasets (i.e. 90 datasets for SVM training and 15 datasets for testing). The set of input variables identified through extensive knowledge-mining are: financial strength, past performance, past experience, human resources, equipment resources, safety and health aspects, environmental considerations, quality management potentials, current workload and management capacity, and claims history. Based on a pseudo-random number generation technique with STATISTICA™ software, the input variables of ‘Data-I’ (i.e. 1050 variables for 105 datasets) were hypothetically generated, and the corresponding prequalification outputs were simulated through consultation with a domain expert. Table 2 portrays a sample extract of the hypothetical datasets arranged in ‘Data-I’.
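A Data-I-style set could be simulated along the following lines. The labeling rule below is an invented stand-in for the domain-expert judgment described above, and NumPy stands in for STATISTICA's pseudo-random generator:

```python
# Hypothetical sketch of generating a Data-I-style simulated dataset.
# The threshold rule for prequalification is an assumption for
# illustration, not the authors' actual expert-derived labeling.
import numpy as np

rng = np.random.default_rng(42)
n_cases, n_criteria = 105, 10
X = rng.integers(0, 6, size=(n_cases, n_criteria))  # scores on a 0-5 scale

# Stand-in "expert" rule: prequalify when the mean criterion score is high.
y = (X.mean(axis=1) >= 3.5).astype(int)  # 1 = prequalified, 0 = not
print(X.shape, int(y.sum()), "prequalified cases")
```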
‘Data-II’ included seventy four datasets from six recent construction projects in China that employed contractor prequalification. The projects are medium sized public projects procured by the traditional ‘design-bid-build’ route, and include 2 power plant projects, 2 railway projects and 2 mixed use buildings. Among these, the budget of the smallest project is 98 million RMB (1 US Dollar = 7.2 RMB approximately). Table 3 presents basic details of the 74 datasets in ‘Data-II’.
The input criteria used for modeling the prequalification problem with ‘Data-II’ are: (i) past performance reference, (ii) past experience strength, (iii) financial strength, (iv) current workload, (v) human resources capacity, (vi) technical capability, (vii) quality potentials, and (viii) environment, health and safety performance potentials. Basically, the criteria values (e.g. for technical capability) were derived from
respective assessments of related sub-criterion sets. For brevity and simplicity, the sub-criterion level details were not considered in the current modeling exercise reported in this paper. In the SVM-based prequalification modeling of ‘Data-II’, the assessment values for all eight input variables were based on a normalized 5-point numerical scale as in [26,29]. Similarly, the SVM-based prequalification modeling outcomes are considered on a binary scale, i.e. ‘1’ represents a ‘prequalified’ status and ‘0’ represents a ‘not prequalified’ result.
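As a hypothetical illustration of the normalization step (the paper does not state its exact formula), a simple min-max rescaling of raw criterion assessments onto the 5-point range might look like:

```python
# Assumed min-max rescaling onto a normalized [1, 5] scale; this is an
# illustrative procedure, not the authors' documented normalization.
import numpy as np

def normalize_to_scale(raw, low=1.0, high=5.0):
    """Rescale raw scores linearly so min -> low and max -> high."""
    raw = np.asarray(raw, dtype=float)
    span = raw.max() - raw.min()
    if span == 0:
        # All scores equal: map to the midpoint of the target scale.
        return np.full_like(raw, (low + high) / 2)
    return low + (raw - raw.min()) * (high - low) / span

scores = [62, 75, 90, 55, 80]          # raw technical-capability scores
print(normalize_to_scale(scores))      # values now lie in [1, 5]
```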
‘Data-III’ is based on a published case study of neural network application for contractor prequalification that used 84 datasets and 11 criteria from some local authority project procurements in the UK [26]. In that research, 74 training cases (including 37 prequalified results) and 10 testing cases (of which 5 prequalified results) were used, and the corresponding neural network performance was demonstrated. In this neural network modeling, normalized scales (i.e. a 5-point scale of ‘1 to 5’ for the eleven input variables and a binary ‘0 or 1’ scale for the output result) were used, and the datasets as well as the neural network outcomes were also published in [26]. These UK based secondary datasets were derived as ‘Data-III’ for further checking and generalization of the proposed SVM models.
4. Support vector machine based robust modeling
As per specific details available in www.kernel-machines. org/,potential software for SVM modeling include:LIBSVM,mySVM, and SVMLight.In this rearch,the LIBSVM tools(such )available at u.edu.tw/~cjlin/ libsvm/were ud for the SVM-bad prequalification modeling.In addition,some facilitative batchfiles svm_train.bat and svm_predict.bat)were developed and customized by the local rearch team,mainly to t the model parameters and conveniently arranging inputs and extracting the corresponding SVM model outcomes.The procedure of SVM-bad prequalification modeling adopted in this rearch is portrayed in Fig.1.
The key tasks of this robust modeling exercise include: (a) arranging the datasets for modeling, (b) setting the parameters and modeling, (c) examining model outcomes and refining, and (d) consolidating results for potential applications. The following summary provides basic details of this LIBSVM-based SVM modeling:
4.1. Arranging the datasets
Using a suitable random selection basis, the normalized datasets for a specific modeling construct (e.g. ‘Construct 2’) were arranged in two parts as a training set and a test set. For example, the 105 hypothetical datasets of ‘Data-I’ (for ‘Construct 1’) were randomly split such that ninety datasets were chosen for the training set and the remaining fifteen datasets constituted the corresponding test set in this pilot modeling exercise. A similar approach was considered for grouping ‘Data-II’, whereas the ‘Data-III’ grouping adopted the prior arrangement as in [26]. The format for arranging testing/training datasets in the LIBSVM based modeling framework is as below:
<label> <index1>:<value1> <index2>:<value2> … <indexN>:<valueN>
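A small helper (an assumed utility, not part of LIBSVM or the paper's batch files) shows how one normalized case maps onto this sparse text format:

```python
# Assumed helper that writes a single case in LIBSVM's text format:
# "<label> <index1>:<value1> <index2>:<value2> ...".
def to_libsvm_line(label, values):
    """Format one case; feature indices in LIBSVM start at 1."""
    feats = " ".join(f"{i}:{v}" for i, v in enumerate(values, start=1))
    return f"{label} {feats}"

# One 'prequalified' case with eight normalized criterion scores:
line = to_libsvm_line(1, [0.8, 0.6, 1.0, 0.4, 0.6, 0.8, 0.6, 0.8])
print(line)  # -> "1 1:0.8 2:0.6 3:1.0 4:0.4 5:0.6 6:0.8 7:0.6 8:0.8"
```

One such line per case, written to a plain text file, is exactly what LIBSVM's training and prediction tools consume.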
4.2. Setting the modeling parameters
In addition to arranging the requisite datasets, the robustness of SVM modeling outcomes is significantly based on relevant parameter settings and subsequent filtering/verification of prediction accuracies for optimal model choice [47]. The modeling parameters include: ‘s’ (session or classification identifier), ‘t’ (kernel function identifier), and ‘C’ (boundary parameter constant, for which the value may range from 1 to 1000). Some sample parameter settings are presented in the ‘Appendix’ section of this paper. Although some useful suggestions are found in the literature, there is no standard guidance for the selection of SVM kernel functions and parameters [36]. Initial settings and subsequent adjustments for the modeling parameters were organized similar to [47,48]. Various kernel parameters and the upper bound C were adjusted to investigate the prediction performance of each model design. Especially, for the polynomial kernel, parameters such as s, d, and c (constant) were decided; and for the sigmoid kernel, the parameters k_1 and k_2 were decided. Conveniently, the model parameter settings were adjusted through customizing
Table 2. A sample extract of hypothetical ‘Data-I’ for pilot modeling explorations

Training data | I1 I2 I3 I4 I5 I6 I7 I8 I9 I10 | Prequalification result | Output variable (O)
1  | 5 5 * * * * * * * * | Prequalified | 1
2  | 4 5 5 5 4 5 4 4 5 4 | Prequalified | 1
3  | 5 5 5 4 3 4 4 4 4 2 | Prequalified | 1
4  | 3 2 2 2 2 2 0 1 1 0 | Not prequalified | 0
5  | 4 3 3 3 3 2 1 2 2 1 | Not prequalified | 0
6  | 3 4 3 2 5 2 1 2 0 2 | Not prequalified | 0
7  | 4 5 2 2 2 3 0 1 1 1 | Not prequalified | 0
8  | 2 3 4 3 1 2 1 2 2 2 | Not prequalified | 0
9  | 5 5 3 5 5 3 5 4 5 4 | Prequalified | 1
10 | 5 5 4 4 3 4 4 4 4 2 | Prequalified | 1
11 | 5 5 4 3 3 4 3 4 4 2 | Prequalified | 1
12 | 5 4 4 4 5 4 3 3 4 3 | Prequalified | 1

Table 3
Summary of prequalification datasets in ‘Data-II’ with practical cases

Project ID | Contractors applied for prequalification | Contractors successfully prequalified | Contractors not prequalified
Power_Plant_1 | 8 | 3 | 5
Power_Plant_2 | 9 | 5 | 4
Railway_work_1 | 16 | 6 | 10
Railway_work_2 | 8 | 5 | 3
Mixed_building_1 | 18 | 8 | 10
Mixed_building_2 | 15 | 6 | 9
Table 1. A summary of research constructs and data arrangements for SVM modeling

Construct ID | Main purpose | Data ID | Key details | Training | Testing
Construct 1 | Pilot modeling | Data-I | 105 hypothetically simulated datasets; 10 input variables (6-point numerical scale, normalized) and 1 output variable (binary scale) | 90 datasets (in five folders) | 15 datasets (in one folder)
Construct 2 | Modeling practical datasets and verification of optimal design | Data-II | 74 practical datasets from recent cases; 8 input variables (5-point numerical scale, normalized) and 1 output variable (binary scale) | 61 datasets (in five folders) | 13 datasets (in one folder)
Construct 3 | Further validation and comparison with neural network outcomes | Data-III | 84 practical datasets from a secondary source; 11 input variables (5-point scale, normalized) and 1 output variable (binary scale) | 74 datasets (in five folders) | 10 datasets (in one folder)
SVM training batch files by editing them in a suitable text editor platform.
4.3. Clustering the datasets by ‘n-fold cross-validation’
The ‘n-fold’ cross-validation was applied for SVM-based modeling of the prequalification problem, i.e. the datasets of a specific modeling construct were clustered into ‘n − 1’ folders for training sets and one folder for the test set. The critical considerations for deciding on the parameter ‘n’ include adequacy of training data and adequacy of test data. For every modeling construct (i.e. ‘Construct 1’, ‘Construct 2’, and ‘Construct 3’), the corresponding datasets were randomly separated into specific ‘n folders’ such that the training data were included in (n − 1) folders and the test data were arranged in the remaining folder. For example, the available datasets of ‘Data-II’ (i.e. 74 prequalification cases) were randomly clustered such that sixty one datasets (of which 27 with a ‘prequalified’ result and 34 with a ‘not prequalified’ outcome) formed the training set in five folders, and the remaining thirteen datasets (of which 6 were ‘prequalified’ and 7 cases were ‘not prequalified’) were arranged in a separate test set folder.

4.4. Model training
The SVM trainings for all the modeling constructs were carried out by duly arranging the respective datasets and performing initial customization of the related svm_train.bat file. The format of the batch file operation is: svmtrain [options] training_set_file [model_file] > [training_output_file]; a sample is presented in the Appendix section. For example, the following are the stepwise training details for radial basis kernel function SVM modeling of ‘Construct 2’ with ‘Data-II’:

1. The available datasets of ‘Data-II’ were systematically set out by random selection into two parts, a training file and a test file. As mentioned earlier, the input variables are normalized on a 5-point numerical scale basis and the output result is scaled into binary format. With initial settings of t = 1 and C = 1, training of the radial basis kernel SVM model was conducted, and the optimal value for the parameter γ was determined.
2. Subsequently, the filtered optimal parameter values were fed into the training model with n-fold cross-validation to obtain the respective accuracy rate outcomes. Then the parameters were adjusted within a certain range around the optimal value (i.e. by certain increments or decrements) to study the corresponding effects on accuracy rates. If an adjustment yielded better optimization, the better value was chosen for the next iteration step; otherwise such a change was ignored. For example, the value of C ranging from 1 to 1000 was set at different levels to obtain an optimal value, and the step length of the test was set as 2^n (where n ∈ N).
3. The corresponding outcome accuracy rates were recorded from each step, and requisite comparisons were carried out to choose optimal values for the SVM parameters at acceptable accuracy values. However, it should be noted that higher support vector counts might be the result of an over-fitting problem, which would subsequently yield poor prediction outcomes.
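The powers-of-two sweep in steps 1–3 can be sketched with scikit-learn's cross-validation utilities standing in for the LIBSVM batch files. The dataset here is synthetic and the parameter ranges are illustrative only:

```python
# Sketch of the 2^n parameter sweep for C with cross-validated accuracy,
# assuming scikit-learn in place of the paper's LIBSVM batch workflow.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(74, 8))          # 74 cases, 8 criteria
y = (X.mean(axis=1) > 0.5).astype(int)       # synthetic labels

best_C, best_acc = None, -1.0
for n in range(0, 10):                       # C = 2^0 ... 2^9
    C = 2.0 ** n
    acc = cross_val_score(SVC(kernel="rbf", gamma=0.05, C=C),
                          X, y, cv=5).mean()
    if acc > best_acc:                       # keep the better value
        best_C, best_acc = C, acc
print(best_C, round(best_acc, 4))
```

A grid over γ values could be nested inside the same loop, mirroring the γ-then-C adjustment order described in steps 1 and 2.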
Fig. 1. SVM-based prequalification model.
4.5. Sample snapshots of SVM training with the ‘Data-II’ construct using different kernel functions
With a ‘6-fold’ classification of ‘Data-II’ and using the polynomial kernel function for SVM training (t = 1), the parameters ‘C’ and ‘d’ were initially set as (1, 1), and the corresponding prediction accuracy was 90.32%. Then, keeping the value of ‘C’, the parameter ‘d’ was reset at different values (i.e. with an increment of 1). The corresponding accuracy values of this modeling were: 90.32%, 95.16%, 88.71%, 67.74%, 53.22%, 50%, 51.61%, 50%, 50%, and 48.39% respectively. Thus, the optimal value for parameter ‘d’ in this modeling was noted as ‘2’ (i.e. with the highest accuracy of 95.16%). Similarly, testing (i.e. using 2^n as the step length) explored such optimal accuracies for various settings of parameter ‘C’. In this SVM training exercise, the optimal accuracy rates of the polynomial kernels were observed to decrease with increases in parameter ‘C’, remaining stable after C = 2^8. Finally, with an accuracy of 95.16%, the optimal values for parameters d and C in this design were noted as 2 and 1 respectively. With a similar approach of parameter seeding and adjustments, the optimal values corresponding to the best prediction accuracies were noted for SVM training with (a) the radial basis kernel function and (b) the sigmoid kernel function. Table 4 presents a comparison of optimal outcomes for the different kernel functions in this modeling of ‘Construct 2’ with ‘Data-II’.
4.6. Testing, examining predictions and consolidating results
Subsequent to the SVM trainings, the set of trained models was tested with the designated test datasets and the corresponding accuracy rates were checked. In each case, the respective batch file svm_predict.bat was used to run and obtain prediction outcomes in the respective files. The accuracy levels of the SVM prediction outcomes were compared using indicators such as (i) accuracy percentage (classification), (ii) mean squared error (regression), and (iii) squared correlation coefficient (regression). After comparing all prediction outcomes, the most accurate prediction result values for every modeling construct were identified, and the corresponding test set outputs were extracted from the respective files.
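The three indicators can be computed as follows; the prediction vector is invented purely to show the calculations:

```python
# Sketch of the three comparison indicators named above, on an invented
# binary prediction vector (1 = prequalified, 0 = not prequalified).
import numpy as np
from sklearn.metrics import accuracy_score, mean_squared_error

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 0])

print("accuracy:", accuracy_score(y_true, y_pred))    # classification
print("MSE:", mean_squared_error(y_true, y_pred))     # regression view
# Squared correlation coefficient between targets and predictions:
r = np.corrcoef(y_true, y_pred)[0, 1]
print("r^2:", r ** 2)
```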
5. Overview of modeling outcomes and comparisons
5.1. Overview of pilot modeling outcomes with hypothetical ‘Data-I’
In this pilot modeling construct with 105 hypothetical datasets, the optimal prediction accuracy values of the SVM training outcomes corresponding to the three kernel functions (polynomial, radial basis, and sigmoid) were 91.1%, 92.2% and 83.3% respectively. Similarly, the accuracy values of the SVM testing outcomes for the three kernels (i.e. polynomial, radial basis, and sigmoid) were 73.3%, 93.3%, and 86.7% respectively. Among these, the radial basis function kernel design yielded the best accuracies in both the training and test sets.
5.2. Overview of SVM modeling outcomes with practical datasets of ‘Data-II’
As detailed in the previous section, the SVM training outcomes of the ‘Data-II’ construct using the three non-linear kernel functions are: (a) polynomial kernel function accuracy = 95.16%; (b) radial basis kernel function accuracy = 95.16%; and (c) sigmoid kernel function accuracy = 93.55%. Similarly, the corresponding SVM testing accuracies with the polynomial, radial basis and sigmoid kernels are 23.08%, 92.31% and 91% respectively. Although both the polynomial kernel and radial basis kernel designs yielded higher training accuracies, the polynomial kernel performance was unacceptably poor in the SVM testing, whereas the radial basis kernel was noted as the most preferable.
5.3. Overview of SVM modeling outcomes with ‘Data-III’ for further comparisons
Earlier studies [24,26,27] identified some potential applications of artificial neural networks for contractor prequalification problems. For example, [26] presented a case study of neural network application for contractor prequalification that used practical datasets from UK based local authority project procurements. ‘Data-III’ is based
Table 4. Comparison of different kernel accuracies of SVM modeling with ‘Data-II’

Parameter ‘C’ | Polynomial kernel accuracy (based on γ = 0.05) | Radial basis kernel accuracy (based on γ = 0.05) | Sigmoid kernel accuracy (based on k1 = 0.083333; k2 = 0)
2^0  | 95.16% | 95.16% | 88.71%
2^1  | 93.55% | 93.55% | 90.32%
2^2  | 95.16% | 93.55% | 93.55%
2^3  | 95.16% | 93.55% | 91.94%
2^4  | 95.16% | 93.55% | 93.55%
2^5  | 95.16% | 91.94% | 85.48%
2^6  | 91.94% | 90.32% | 83.87%
2^7  | 80.64% | 82.26% | 75.81%
2^8  | 77.42% | 82.26% | 74.19%
2^9  | 77.42% | 82.26% | 75.81%
2^10 | 77.42% | 82.26% | 75.81%
2^11 | 77.42% | 82.26% | 75.81%
2^12 | 77.42% | 82.26% | 75.81%
2^13 | 77.42% | 82.26% | 75.81%
Fig. 2. Selecting a suitable parameter value for gamma (γ).
Fig. 3. A comparison of neural network (ANN) and SVM predictions.
时间范围
on secondary datasets from [26], which were considered for further validation of the SVM-based prequalification modeling construct in this research. Also, useful comparisons were made with the neural network predictions of [26]. Among the three non-linear kernel functions considered for SVM modeling in this construct, the performance of the radial basis kernel function was again found better suited to the prequalification problem. Fig. 2 portrays the accuracy rates for different settings of the SVM parameter γ; its optimal value was noted as 0.4 with respect to the highest accuracy of 95.8904%.
As discussed earlier, the radial basis kernel SVM training was continued with γ = 0.4 and the other parameters were adjusted. Finally, the accuracy of 95.8904% was confirmed as the best optimal value with parameter C = 1 (i.e. 2^0). Fig. 3 portrays a comparison of the radial basis kernel SVM test results with the corresponding neural network test set outcomes as per [26]. In summary, the SVM predictions in this construct with ‘Data-III’ also yielded encouraging outcomes, and the prediction accuracy of the optimal SVM model with the radial basis kernel design is better than the corresponding neural network outcomes of [26].
6. Discussion and proposal for an SVM-based decision support framework
The research extracts presented in this paper do not primarily focus on improving the prevailing prequalification decision-making designs such as criteria and scoring methods. Mainly, this paper highlights the promising application potential of SVM-based decision support for contractor prequalification problems. In this research exercise, the LIBSVM software [45] was used to build the models and explore such potential applications. From the comparison of SVM modeling outcomes in this research, the radial basis kernel design seems to be more suitable for contractor prequalification in construction project procurement. However, the reported findings are based on modeling with limited datasets and specific criteria sets in normalized formats. A broader study would be valuable for further extending such findings into more generalized recommendations. Moreover, arranging adequate datasets for robust modeling of procurement problems is often difficult due to various inherent challenges such as confidentiality, sensitivity, project-specific uniqueness and one-off arrangements. For example, some key characteristics of one-off and periodical prequalification practices were highlighted in [7]. However, adopting suitable standardized practices (e.g. in terms of criteria consideration and normalized evaluation) among several projects could be deemed supportive for extending this SVM proposal to one-off prequalification tasks.
In SVM models, the learnt information is embedded and less explicit for direct comprehension; hence the corresponding decision support might be criticized as the resultant of ‘black-box’ tools [30]. Moreover, the basic reliance of SVM outcomes on model training requirements can be a significant limitation in certain cases, as the selection variables and decision-making designs might vary (e.g. according to project-specific requirements). In addition, prequalification is mainly practiced in large/complex public sector project procurements, which are often governed by several bureaucratic requirements including transparency and de-briefing arrangements for unsuccessful applicants. Hence, additional reinforcements such as knowledge and training of the relevant parties involved in the procurement transactions might be necessary. Furthermore, high level promotion from key stakeholders and effective measures for overcoming cultural inertia are necessary, because the SVM-based models could be deemed useful for building valuable decision support tools in the optimal selection of best-value based procurement exercises.
In order to consider the distinctions of SVM, comparisons with different prequalification models were made as in [49]. For example, the key comparisons of the analytic hierarchy process, artificial neural networks and fuzzy set theories revealed that (a) the analytic hierarchy process can be employed for hierarchical decision models, whereas its main drawbacks include pair-wise comparison shortcomings
Fig. 4. A proposed model of decision support system for contractor prequalification.