
1 Wall Following Robot Path Planning Problem: A Statistical Learning Approach
Presented by: Rafael O. Batista Jorge
Universidad de Puerto Rico, Recinto Universitario de Mayagüez
ININ6048 Final Project Presentation, Mayagüez, 2016

2 Agenda
 Problem overview
 Dataset description
 Methods
    Preprocessing
    Selected machine learning algorithms
    Performance measures
    Stacked learner
 Results
    Preprocessing
    Selected machine learning algorithms
    Performance measures
    Stacked learner
    PCA vs LDA
 Conclusions

3 Problem overview
 Automatic robot path-planning decisions
 Multiple types of applications
 Learning from the environment (sensor measurements)
 Different types of approaches
   o Heuristic approaches (fuzzy logic)
   o Geometry-based approaches (optimization-based map search)
   o Machine learning algorithms (supervised and unsupervised)
 Open research problem
   o Real-time execution
   o Dynamic environments
   o Unknown environments
Image by Simeon87, distributed under a CC-BY 2.0 license.

4 Problem overview  SCITOS G5 robot Photograph by MetraLabs.

5 Dataset description

Variable           Description               Type          Values                                           Predictor  Missing Values
UDS1               Position Angle = 180°     Quantitative  (0.400, 0.401, 0.402), (5.000, 4.866, 4.860)     YES        0%
UDS3               Position Angle = -150°    Quantitative  (0.470, 0.471, 0.492), (5.029, 5.028, 5.026)     YES        0%
UDS6               Position Angle = -105°    Quantitative  (1.114, 1.115, 1.118), (5.005, 5.000, 4.980)     YES        0%
UDS9               Position Angle = -60°     Quantitative  (0.836, 0.854, 0.861), (5.000, 4.956, 4.955)     YES        0%
UDS12              Position Angle = -15°     Quantitative  (0.778, 0.779, 0.780), (5.000, 4.992, 4.981)     YES        0%
UDS15              Position Angle = 30°      Quantitative  (0.495, 0.496, 0.497), (5.000, 4.921, 4.920)     YES        0%
UDS18              Position Angle = 75°      Quantitative  (0.354, 0.355, 0.356), (5.000, 4.608, 4.591)     YES        0%
UDS21              Position Angle = 120°     Quantitative  (0.380, 0.381, 0.382), (5.000, 4.822, 4.812)     YES        0%
UDS24              Position Angle = 165°     Quantitative  (0.377, 0.380, 0.381), (5.000, 4.871, 4.865)     YES        0%
Move Forward       Path Following Decision   Qualitative   2205 (40.41%)                                    NO         0%
Sharp Right Turn   Path Following Decision   Qualitative   2097 (38.43%)                                    NO         0%
Slight Left Turn   Path Following Decision   Qualitative   328 (6.01%)                                      NO         0%
Slight Right Turn  Path Following Decision   Qualitative   826 (15.13%)                                     NO         0%

 Twenty-four sensors (quantitative)
 Four classes
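A minimal sketch of loading the 24-sensor dataset for the analyses that follow; the file name sensor_readings_24.data, the column labels, and the use of pandas are assumptions based on the UCI repository entry cited in the references, not part of the original slides.

```python
# Sketch: load the wall-following dataset (assumed file name and layout).
import pandas as pd

# The file is assumed comma-separated: 24 ultrasound readings followed by the class label.
cols = [f"UDS{i}" for i in range(1, 25)] + ["Class"]
df = pd.read_csv("sensor_readings_24.data", header=None, names=cols)

X = df[cols[:-1]].to_numpy()   # quantitative predictors (24 sensors)
y = df["Class"].to_numpy()     # four qualitative path-following decisions
print(df["Class"].value_counts(normalize=True))   # class proportions, cf. the table above
```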

6 Methods
 Preprocessing
    Multivariate normality test
       Henze-Zirkler's Multivariate Normality Test
       Mardia's Multivariate Normality Test
       Chi-square Q-Q plot
    Outlier detection
       Mahalanobis distance (assumes an MVN dataset)
       Random forest instance proximity (no normality assumption)
    Collinearity and dimensionality reduction
       Principal components analysis
         o Correlation matrix (scaling)
         o Explained variance (number of principal components)
         o Unsupervised
       Linear discriminant analysis
         o Assumes normality
         o Class separation
         o Supervised
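A minimal sketch, continuing the loading step above, of two of the preprocessing steps listed here: Mahalanobis-distance outlier screening (which assumes multivariate normality) and PCA on the correlation matrix, i.e. on standardized sensors. The chi-square cutoff is an illustrative assumption.

```python
# Sketch: Mahalanobis-distance outlier screening and PCA on standardized sensors.
import numpy as np
from scipy.stats import chi2
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def mahalanobis_outliers(X, alpha=0.001):
    """Flag rows whose squared Mahalanobis distance exceeds a chi-square cutoff
    (valid under approximate multivariate normality, as noted above)."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    diff = X - mu
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared distances
    return d2 > chi2.ppf(1 - alpha, df=X.shape[1])

# PCA on the correlation matrix = PCA after standardizing each sensor.
X_std = StandardScaler().fit_transform(X)
pca = PCA()                       # keep all components; choose by explained variance later
X_pca = pca.fit_transform(X_std)  # the "PCA rotated dataset" used in the results
```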

7 Methods
 Selected machine learning algorithms
    Random forest
       Non-parametric and supervised
       Ensemble (bagging)
    k-NN
       Non-parametric
       Supervised
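A minimal sketch of fitting these two learners on the PCA-rotated data from the sketch above; the hyperparameter values shown here are placeholders, and the tuned values appear later in the deck.

```python
# Sketch: the two non-parametric supervised learners considered here.
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rf = RandomForestClassifier(n_estimators=100, random_state=1)   # bagged ensemble of trees
knn = KNeighborsClassifier(n_neighbors=9)                       # distance-based lazy learner

for name, clf in [("RF", rf), ("kNN", knn)]:
    acc = cross_val_score(clf, X_pca, y, cv=10).mean()          # 10-fold cross-validation
    print(f"{name}: CV classification error = {1 - acc:.4f}")
```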

8 Methods
 Selected machine learning algorithms
    Support Vector Machine
       Cost
       Gamma
       Kernel trick
       Non-linear
       Different types of kernels:
         o Linear
         o Polynomial
         o Radial
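A minimal sketch of the three kernel choices on the PCA-rotated data; the cost (C) and gamma values here are illustrative, and the tuned values are reported in the summary table later in the deck.

```python
# Sketch: SVMs with linear, polynomial and radial kernels; cost (C) and gamma
# control the margin penalty and the kernel width.
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

candidates = [
    SVC(kernel="linear", C=1.0),
    SVC(kernel="poly", degree=3, C=0.1, gamma=0.5),   # third-degree polynomial
    SVC(kernel="rbf", C=10, gamma=0.5),               # radial basis function
]
for clf in candidates:
    acc = cross_val_score(clf, X_pca, y, cv=10).mean()
    print(clf.kernel, f"CV classification error = {1 - acc:.4f}")
```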

9 Methods
 Selected machine learning algorithms
 Support Vector Machine (figure panels: Cost = 100 and Cost = 0.1)

10 Methods
 Performance measures
    Classification error
    Kappa statistic
    Multiclass logarithmic loss
    Desirability function
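A minimal sketch of computing these measures. The desirability form shown here (linear per-metric desirabilities combined by a geometric mean, Derringer-Suich style) is an assumption; the slides do not specify the exact transformation used.

```python
# Sketch: the performance measures, plus an assumed overall desirability score.
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score, log_loss

def performance(y_true, y_pred, y_prob, classes):
    err = 1.0 - accuracy_score(y_true, y_pred)        # classification error
    kappa = cohen_kappa_score(y_true, y_pred)         # agreement beyond chance
    mll = log_loss(y_true, y_prob, labels=classes)    # multiclass logarithmic loss
    return err, kappa, mll

def desirability(values, lows, highs, maximize):
    """Scale each metric to [0, 1] and combine with a geometric mean (assumed form)."""
    d = []
    for v, lo, hi, up in zip(values, lows, highs, maximize):
        t = (v - lo) / (hi - lo)
        d.append(np.clip(t if up else 1.0 - t, 0.0, 1.0))
    return float(np.prod(d) ** (1.0 / len(d)))
```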

11 Methods
 Selected machine learning algorithms
 Stacked learner (diagram): class probabilities from kNN (Neighbors = 9), RF (N. Trees = 100) and SVM (Degree = 3), obtained by CV on the PCA-rotated dataset, are combined with the dataset class labels and passed to an RF meta-learner that produces the final result.
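A minimal sketch of this stacked learner using scikit-learn's StackingClassifier; treating the final RF as a meta-learner over out-of-fold class probabilities is my reading of the diagram, and the number of CV folds is an assumption.

```python
# Sketch of the stacked learner: probabilities from kNN (9 neighbours), RF (100 trees)
# and a degree-3 polynomial SVM, produced by cross-validation on the PCA-rotated
# dataset, are fed to a random-forest meta-learner.
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

base_learners = [
    ("knn", KNeighborsClassifier(n_neighbors=9)),
    ("rf",  RandomForestClassifier(n_estimators=100, random_state=1)),
    ("svm", SVC(kernel="poly", degree=3, probability=True)),
]
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=RandomForestClassifier(n_estimators=100, random_state=1),
    stack_method="predict_proba",   # stack the class probabilities, as in the diagram
    cv=5,                           # out-of-fold probabilities (the "CV" box above)
)
stack.fit(X_pca, y)
```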

12 Results  Preprocessing: Multivariate normality test.

13 Results  Preprocessing: Outlier detection

14 Results  Preprocessing: PCA scree plot and explained variance

                        PC1   PC2   PC3   PC4   PC5   PC6   PC7   PC8   PC9   PC10  PC11  PC12  PC13  PC14  PC15
Standard deviation      2.15  1.85  1.35  1.26  1.15  1.08  1.04  1.00  0.88  0.85  0.80  0.79  0.77  0.73  0.72
Proportion of Variance  0.19  0.14  0.08  0.07  0.05  0.05  0.05  0.04  0.03  0.03  0.03  0.03  0.02  0.02  0.02
Cumulative Proportion   0.19  0.33  0.41  0.48  0.53  0.58  0.63  0.67  0.70  0.73  0.76  0.78  0.81  0.83  0.85
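A minimal sketch of reproducing this scree information from the fitted PCA object in the preprocessing sketch above; exact values will depend on the preprocessing choices.

```python
# Sketch: print the scree summary (standard deviation, proportion, cumulative proportion).
import numpy as np

sd = np.sqrt(pca.explained_variance_)     # standard deviation of each principal component
prop = pca.explained_variance_ratio_      # proportion of variance explained
cum = np.cumsum(prop)                     # cumulative proportion
for i, (s, p, c) in enumerate(zip(sd, prop, cum), start=1):
    print(f"PC{i:<3d} sd={s:5.2f}  prop={p:5.2f}  cum={c:5.2f}")
```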

15 Results  Preprocessing: LDA histogram. Histogram of the first linear discriminant.

16 Results  Preprocessing: LDA histogram. Histogram of the second linear discriminant.

17 Results  Preprocessing: LDA histogram. Histogram of the third linear discriminant.
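A minimal sketch of producing these per-discriminant histograms; with four classes LDA yields at most three discriminants, matching the three plots above. Plotting choices (bins, transparency) are illustrative.

```python
# Sketch: project onto the linear discriminants and plot per-class histograms of the
# first discriminant (the same idea repeats for the second and third discriminants).
import numpy as np
import matplotlib.pyplot as plt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

lda = LinearDiscriminantAnalysis()       # at most (4 classes - 1) = 3 discriminants
X_lda = lda.fit_transform(X_std, y)      # the "LDA loadings dataset" used in the results

for label in np.unique(y):
    plt.hist(X_lda[y == label, 0], bins=30, alpha=0.5, label=str(label))
plt.xlabel("First linear discriminant")
plt.legend()
plt.show()
```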

18 Results  Tuning: RF on the LDA loadings dataset

19 Results  Tuning: RF on the PCA-rotated dataset
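A minimal sketch of the RF tuning step on both reduced datasets; the grid values are assumptions, since the slides only show the resulting tuning plots.

```python
# Sketch: random-forest tuning on the LDA loadings and PCA-rotated datasets.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {"n_estimators": [100, 250, 500], "max_features": [1, 2, 3]}
for name, Xr in [("LDA loadings", X_lda), ("PCA rotated", X_pca)]:
    search = GridSearchCV(RandomForestClassifier(random_state=1), param_grid, cv=10)
    search.fit(Xr, y)
    print(name, search.best_params_, f"CV error = {1 - search.best_score_:.4f}")
```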

20 Results  Tuning: SVM with LDA loadings dataset (radial kernel)

21 Results  Tuning: SVM with PCA rotated dataset (polynomial kernel)

22 Results  Tuning: SVM with PCA rotated dataset (polynomial kernel)

23 Results  Tuning: Summary table for SVM

Dataset       Kernel      Degree  Cost  Gamma  Resulting Error
PCA rotated   Radial      NA      10    0.5    0.1082
PCA rotated   Polynomial  Third   0.1   0.5    0.1049
LDA loadings  Radial      NA      100   2      0.2515
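A minimal sketch of the grid search behind this table; the candidate grids are assumptions, chosen only so that the best settings reported above are reachable.

```python
# Sketch: SVM grid search over kernel, cost and gamma on both reduced datasets.
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

param_grid = [
    {"kernel": ["rbf"],  "C": [0.1, 1, 10, 100], "gamma": [0.5, 1, 2]},
    {"kernel": ["poly"], "C": [0.1, 1, 10, 100], "gamma": [0.5, 1, 2], "degree": [2, 3]},
]
for name, Xr in [("PCA rotated", X_pca), ("LDA loadings", X_lda)]:
    search = GridSearchCV(SVC(), param_grid, cv=10)
    search.fit(Xr, y)
    print(name, search.best_params_, f"CV error = {1 - search.best_score_:.4f}")
```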

24 Results  Implementation: PCA rotated dataset and kNN

25 Results  Implementation: PCA rotated dataset and RF

26 Results  Implementation: PCA rotated dataset and SVM

27 Results  Implementation: Dataset for stacked approach

28 Results  Implementation: PCA rotated dataset stacked learner

29 Results  Implementation: LDA loadings dataset and kNN

30 Results  Implementation: LDA loadings dataset and RF

31 Results  Implementation: LDA loadings dataset and SVM

32 Results  Implementation: LDA loadings dataset and stacked learner

33 PCA vs LDA  Comparing kNN performance (figure panels: PCA and LDA)

34 PCA vs LDA  Comparing RF performance (figure panels: PCA and LDA)

35 Conclusions
 The best performance of our proposal was obtained using RF on the LDA loadings dataset, with a desirability value of 94.75%.
 The stacking approach was successful and improved the desirability value to 99.99% in both cases, LDA and PCA.
 Machine learning techniques are useful for this kind of path-planning problem with dynamic or unknown environments.
 Future work may consider better optimization strategies for selecting the optimal SVM parameters.

36 References
Fletcher, T. (2009). Support Vector Machines Explained. Retrieved from www.cs.ucl.ac.uk/staff/T.Fletcher/
Freire, A., Veloso, M., & Barreto, G. (2010). UCI Machine Learning Repository [http://www.ics.uci.edu/~mlearn/MLRepository.html]. Irvine, CA: University of California, School of Information and Computer Science.
Freire, A. L., Barreto, G. A., Veloso, M., & Varela, A. T. (2009). Short-term memory mechanisms in neural network learning of robot navigation tasks: A case study. 2009 6th Latin American Robotics Symposium, LARS 2009, (4). https://doi.org/10.1109/LARS.2009.5418323
Hastie, T. J., Tibshirani, R. J., & Friedman, J. H. (2009). The elements of statistical learning: Data mining, inference, and prediction. New York: Springer.
MetraLabs GmbH. (2011). SCITOS G5 Embedded PC and Operating System. Retrieved December 14, 2016, from www.metralabs.com
Otte, M. W. (2015). A Survey of Machine Learning Approaches to Robotic Path-Planning. Retrieved from http://www.cs.colorado.edu/~mozer/Teaching/Computational Modeling Prelim/Otte.pdf


