
17th June webinar – José Principe

Zoom link https://videoconf-colibri.zoom.us/j/98321826530

Abstract
This talk presents an overview of how information-theoretic (IT) concepts and algorithms can be applied to explain the mappings learned in machine learning. The first step is to select an approach for estimating entropy and mutual information directly from data, because in machine learning the pdf of the data is normally unknown. We will show how Rényi's entropy and mutual information can be estimated from the eigenspectrum of the Gram matrix of kernel learning. We will then briefly explain how IT can be used to analyze the dynamics of learning and to set up proper topologies in deep learning.
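
As a rough illustration of the estimator mentioned in the abstract, the Python sketch below computes the matrix-based Rényi alpha-entropy S_alpha(A) = (1/(1-alpha)) * log2(sum_i lambda_i(A)^alpha), where A is the trace-normalized Gram matrix and lambda_i(A) are its eigenvalues, and combines two such entropies into a mutual information estimate via the entrywise product of Gram matrices. This is a minimal sketch, not the speaker's implementation: the Gaussian kernel, the bandwidth, and all function names are illustrative assumptions; only the eigenspectrum formula follows the matrix-based formulation.

import numpy as np

def gaussian_gram(X, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2)).
    # The Gaussian kernel and the bandwidth are illustrative choices.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def renyi_entropy(K, alpha=2.0):
    # Matrix-based Renyi alpha-entropy from the eigenspectrum of the
    # trace-normalized Gram matrix A = K / tr(K).
    A = K / np.trace(K)
    eigvals = np.linalg.eigvalsh(A)        # symmetric matrix: real spectrum
    eigvals = np.clip(eigvals, 0.0, None)  # clip negative round-off noise
    return np.log2(np.sum(eigvals**alpha)) / (1.0 - alpha)

def renyi_mutual_information(Kx, Ky, alpha=2.0):
    # I(X; Y) = S(X) + S(Y) - S(X, Y); the joint is represented by the
    # entrywise (Hadamard) product of the two Gram matrices, which is
    # trace-normalized inside renyi_entropy.
    return (renyi_entropy(Kx, alpha) + renyi_entropy(Ky, alpha)
            - renyi_entropy(Kx * Ky, alpha))

# Example: Y is a noisy linear function of X, so the estimate for (X, Y)
# should be well above zero, while an independent Z should give a value
# near zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = X @ rng.normal(size=(3, 2)) + 0.1 * rng.normal(size=(200, 2))
Z = rng.normal(size=(200, 2))
Kx, Ky, Kz = gaussian_gram(X), gaussian_gram(Y), gaussian_gram(Z)
print(renyi_mutual_information(Kx, Ky))  # dependent pair: larger value
print(renyi_mutual_information(Kx, Kz))  # independent pair: near zero

The same entropy function applied to the activations of each layer of a deep network is what allows the learning dynamics to be tracked in information terms, as the abstract describes.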

Short Bio
José C. Principe (M’83-SM’90-F’00) is a Distinguished Professor of Electrical and Computer Engineering and Biomedical Engineering at the University of Florida, where he teaches advanced signal processing, machine learning, and artificial neural network (ANN) modeling. He is the Eckis Professor and the Founder and Director of the University of Florida Computational NeuroEngineering Laboratory (CNEL), www.cnel.ufl.edu. His primary area of interest is the processing of time-varying signals with adaptive neural models. The CNEL Lab has been studying signal and pattern recognition principles based on information-theoretic criteria (entropy and mutual information).
Dr. Principe is an IEEE Fellow. He is a past Chair of the Technical Committee on Neural Networks of the IEEE Signal Processing Society, a past President of the International Neural Network Society, and a past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering. He is a member of the Advisory Board of the University of Florida Brain Institute. Dr. Principe has more than 1,000 publications and an h-index of 101. He has directed 108 Ph.D. dissertations and 65 Master's theses. In 2000 he wrote an interactive electronic book entitled “Neural and Adaptive Systems,” published by John Wiley and Sons, and has more recently co-authored several books: “Brain Machine Interface Engineering” (Morgan and Claypool), “Information Theoretic Learning” (Springer), and “Kernel Adaptive Filtering” (Wiley).
