Welcome to 全傑科技 (SoftHome Technology)


Upcoming training events for this software

No events are currently scheduled.
Past events for this software
Product briefings are available for this software; please contact us.

Downloads

Event Information

  • No training courses are currently scheduled.

Contact Us


We provide a professional software-procurement service.
If you need a product not listed on this website,
please call us. Thank you!
Tel: (02) 2507-8298

NeuroShell Classifier 2
Neural-network classification software
Professional system that learns historical patterns to categorize or classify data.
Product code: 7144
Supported platforms: Windows 2000, Windows XP, Windows Vista, Windows 7
  • Education edition
  • Commercial edition
  • Reactivation service
  • Remote demonstration
  • On-site training
  • Remote installation and activation service
  • Technical support from the original vendor
  • Product briefings
  • Education and training
  • Example teaching files
  • Perpetual license
  • Currently in stock
  • Chinese-language catalog
  • Installation serial number
  • Guaranteed licensed copy
  • English electronic manual
  • ESD (electronic software download)

National Taipei University of Business
Shih Chien University
Ming Chi University of Technology
Kaohsiung Municipal Siaogang Hospital (operated under contract by Kaohsiung Medical University)
Chung Yuan Christian University
今日儀器股份有限公司
Yuan Ze University
National Atomic Research Institute
National Cheng Kung University
National University of Kaohsiung
National Kaohsiung University of Science and Technology (West Campus)
Tatung University
Tamkang University
Department of Business Administration
Department of Finance
Department of Industrial Engineering and Management
Department of Psychiatry
Department of International Business and Trade
Energy and Environmental Science Center
Department of Materials Science and Engineering
Department of Electrical Engineering
Department of Mechanical and Automation Engineering
Department of Computer Science and Information Engineering
Department of Environmental and Safety Engineering
Department of Transportation Management
Features
The NeuroShell Classifier solves classification and categorization problems based on patterns learned from historical data. The Classifier produces outputs that are the probabilities of the input pattern belonging to each of several categories; examples of categories include {acidic, neutral, alkaline}, {buy, sell, hold}, and {cancer, benign}.

The classification algorithms (one a new neural network, the other a statistical classifier driven by a genetic algorithm) are the crowning achievement of several years of research and have been optimized for classification problems. Gone are the days of dozens of parameters that must be set artfully to create a good model without overfitting, and of hiring a neural-net expert or a statistician to build your classification models. The NeuroShell Classifier lets you build powerful classification models quickly. Statistical tools such as an agreement matrix (sensitivity and specificity), probability graphs, ROC curves, and input rankings help you analyze the effectiveness of your model.

Two of the most common complaints about previous classification systems, aside from being too hard to use, are that they are too slow and that they do not accurately report how important each variable is to the model. We have addressed both problems by offering two training methods. The first, TurboProp2, dynamically grows hidden neurons and trains very fast: TurboProp2 models are built (trained) in 10 to 30 seconds on a 200 MHz Pentium (a matter of seconds on newer computers), compared to hours for older neural network types. The genetic training method takes somewhat longer but reveals the relative importance of each of your inputs, so you will know which data you no longer need to collect.
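The Classifier's outputs are per-category probabilities. As a generic illustration of that idea (not WSG's proprietary algorithm), a model that produces raw scores for the categories {buy, sell, hold} can normalize them into probabilities with a softmax; the scores below are made up:

```python
import math

def softmax(scores):
    """Normalize raw category scores into probabilities that sum to 1."""
    m = max(scores.values())                       # subtract max for numeric stability
    exps = {c: math.exp(s - m) for c, s in scores.items()}
    total = sum(exps.values())
    return {c: e / total for c, e in exps.items()}

# Hypothetical raw scores from a model for one input pattern
probs = softmax({"buy": 2.0, "sell": 0.5, "hold": 1.0})
print(probs)  # the highest probability goes to 'buy'
```

The key property matching the product description is that the three outputs form a probability distribution over the categories.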
The genetic training method also trains everything in an out-of-sample mode: it essentially performs a leave-one-out evaluation (also called the 'jackknife' or cross-validation), so you are in effect viewing the training set out-of-sample. The method is therefore extremely effective when you do not have many patterns to train on.

The NeuroShell Classifier integrates easily with other programs because it uses standard text files, which are readily imported from and exported to spreadsheet programs such as Excel and Lotus®.

The NeuroShell Classifier is so easy to use that it does not need a manual. Instead, an 'Instructor' guides you through building classification models, and at every stage of the Instructor the extensive help file gives you all the information you need. Once you have learned from the Instructor, you can turn it off and work from the toolbar or menus. (The program does include an online manual that you may print or browse on your computer.) The Classifier displays an ROC curve to help you analyze the effectiveness of your model, and it shows you the estimated relative importance of each variable in the model.

Finally, for those who want to embed the resulting neural models in their own programs, or to distribute the results, an optional Run-Time Server is available. Classifier models may be distributed in your programs without incurring royalties or other fees.

--------------------------------------------------------------------------------

Specifications

Software Requirements
The NeuroShell Classifier is a 32-bit program that requires Microsoft® Windows® 98, Windows 2000, XP®, or Windows NT® (SP3 or higher). It will not run under Windows 3.1.

Hardware Requirements
IBM® PC or compatible computer with a 486 or higher processor and 16 MB of RAM.
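The leave-one-out idea described above can be sketched with any classifier: each pattern is classified by a model built from all the other patterns. Here a 1-nearest-neighbour stand-in (not the product's own algorithm) is evaluated on comma-separated rows like the text files the Classifier reads; the data is invented for illustration:

```python
import csv
import io
import math

# Hypothetical comma-separated data: two input variables plus a category label
raw = """1.0,1.1,acidic
0.9,1.0,acidic
5.0,5.2,alkaline
5.1,4.9,alkaline
3.0,3.1,neutral
2.9,2.8,neutral"""

rows = [([float(a), float(b)], label)
        for a, b, label in csv.reader(io.StringIO(raw))]

def nearest_label(x, train):
    """1-NN stand-in classifier: return the label of the closest training pattern."""
    return min(train, key=lambda r: math.dist(x, r[0]))[1]

# Leave-one-out: classify each row with a "model" built from all the other rows
correct = sum(nearest_label(x, rows[:i] + rows[i + 1:]) == y
              for i, (x, y) in enumerate(rows))
print(f"leave-one-out accuracy: {correct}/{len(rows)}")
```

Because every prediction is made on a pattern the model never saw, the accuracy is an out-of-sample estimate even though no separate test set was held aside, which is exactly why the technique helps when few patterns are available.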
Limits
150 input variables and one output variable (with multiple categories); 16,000 rows of data (example patterns). Note: these limits are not as restrictive as they may seem to owners of large databases; call for an explanation.

Files
ASCII text files separated by commas, spaces, tabs, or semicolons. If your data is in a spreadsheet, simply save it as a .CSV file.

Speed
Neural nets train very fast, usually in under a minute. The genetic method trains very slowly on large files and may be more suitable for fewer than 3,000 rows of data.

Statistics and Graphics
Number of correct and incorrect classifications; actual vs. predicted; Receiver Operating Characteristic (ROC) curves; agreement matrix (sometimes called a confusion matrix or contingency table).

Methodology
There are two neural network paradigms. One is a proprietary algorithm called TurboProp™ 2, which is NOT based on the old backpropagation algorithm. The other uses an advanced variant of Probabilistic Neural Nets (PNN).

--------------------------------------------------------------------------------

Features

Ability to Select the Level of Generalization in the Neural Training Strategy
The neural method can be adjusted after it is trained to provide more or less generalization. Pressing the Advanced button lets you select the level of generalization from 0% (no enhanced generalization) to 100% (over-generalization). A setting of 50% is equivalent to Enhanced Generalization and is the default when the Enhanced Generalization option is checked.

Maximum Number of Hidden Neurons for the Neural Training Strategy
You may set the number of hidden neurons to a maximum of 150 when using the neural training strategy, giving you some control over how the neural net fits the data. You may even specify zero hidden neurons for a linear model.
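The second paradigm named in the methodology, Probabilistic Neural Nets, can be sketched in its textbook form: each category's output is a sum of Gaussian kernels centered on that category's training patterns, normalized into probabilities. The smoothing parameter and data below are assumptions for illustration; WSG's PNN variant is proprietary:

```python
import math

def pnn_classify(x, train, sigma=0.5):
    """Textbook PNN: sum a Gaussian kernel over each category's training
    patterns, then normalize the per-category sums into probabilities."""
    sums = {}
    for pattern, label in train:
        d2 = sum((a - b) ** 2 for a, b in zip(x, pattern))
        sums[label] = sums.get(label, 0.0) + math.exp(-d2 / (2 * sigma ** 2))
    total = sum(sums.values())
    return {label: s / total for label, s in sums.items()}

# Hypothetical training patterns for a two-category problem
train = [([0.0, 0.0], "benign"), ([0.2, 0.1], "benign"),
         ([3.0, 3.0], "cancer"), ([2.8, 3.2], "cancer")]
probs = pnn_classify([0.1, 0.1], train)
print(probs)  # dominated by the nearby 'benign' kernels
```

A PNN in this form "trains" by simply storing the patterns, which is consistent with the page's claim that the neural methods build models in seconds.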
Maximum Number of Generations Without Improvement in the Genetic Training Strategy
You may set the maximum number of generations without improvement that the algorithm will train on, from 10 to 1000 (integers only). This lets you control the length of training time.

Fitness Coefficient Matrix
When using the genetic training strategy, you may change the goal of the genetic optimization: minimize the total number of incorrect classifications, minimize the average percentage of incorrect classifications over all categories, or maximize a custom fitness function built from a user-defined Fitness Coefficient Matrix. For example, a physician may want a model that minimizes false negatives rather than one that treats all wrong answers the same.

Agreement Matrix (Contingency Table) Statistics
The agreement-matrix statistics comparing actual and predicted classifications are defined with the category under consideration taken as positive:

  • True-positive ratio (sensitivity): the number of patterns classified as positive by the network that were confirmed positive, divided by the total number of confirmed positive patterns. It is also equal to one minus the false-negative ratio.
  • False-positive ratio: the number of patterns classified as positive by the network that were confirmed negative, divided by the total number of confirmed negative patterns. It is also equal to one minus the true-negative ratio.
  • True-negative ratio (specificity): the number of patterns classified as negative by the network that were confirmed negative, divided by the total number of confirmed negative patterns. It is also equal to one minus the false-positive ratio.
  • False-negative ratio: the number of patterns classified as negative by the network that were confirmed positive, divided by the total number of confirmed positive patterns. It is also equal to one minus the true-positive ratio.

When the category under consideration is negative, the terms are reversed.
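The four ratios, and the complement identities between them, can be checked directly from confusion-matrix counts; the counts here are made up for illustration:

```python
# Hypothetical confusion-matrix counts, with one category treated as positive
tp, fp, tn, fn = 40, 5, 50, 10

sensitivity = tp / (tp + fn)   # true-positive ratio
specificity = tn / (tn + fp)   # true-negative ratio
false_pos   = fp / (fp + tn)   # false-positive ratio
false_neg   = fn / (fn + tp)   # false-negative ratio

# The complement identities stated in the definitions above
assert abs(sensitivity - (1 - false_neg)) < 1e-12
assert abs(specificity - (1 - false_pos)) < 1e-12
print(sensitivity, specificity)  # 0.8 and roughly 0.909
```

Note that sensitivity and the false-negative ratio share a denominator (confirmed positives), while specificity and the false-positive ratio share the other (confirmed negatives), which is why each pair sums to one.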
---
National Taipei University of Business Department of Business Administration
Shih Chien University Department of Finance
MingChi University of Technology Department of Industrial Engineering & Management
Kaohsiung Municipal Siaogang Hospital Department of Psychiatry