
Evaluating Defensive Distillation For Defending Text Processing Neural Networks Against Adversarial Examples (Aug 21 2019). Adversarial examples are artificially modified input samples which lead to misclassifications, while not being detectable by humans. These adversarial examples are a challenge for many tasks such as image and text classification, especially as research ...
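Defensive distillation centres on a temperature-scaled softmax: the defender trains a student network on soft labels produced at high temperature. A minimal NumPy sketch of that softmax (the logit values are illustrative, not from the paper):

```python
import numpy as np

def softmax_t(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer label distributions."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Distillation trains the student on soft labels produced at high temperature.
logits = np.array([4.0, 1.0, 0.5])
hard = softmax_t(logits, T=1.0)       # peaked distribution
soft = softmax_t(logits, T=20.0)      # much flatter distribution
```

Raising T flattens the distribution, which is what transfers the teacher's "dark knowledge" about wrong classes to the student.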

Enabling hyperparameter optimization in sequential autoencoders for spiking neural data (Aug 21 2019; updated Aug 22 2019). Continuing advances in neural interfaces have enabled simultaneous monitoring of spiking activity from hundreds to thousands of neurons. To interpret these large-scale data, several methods have been proposed to infer latent dynamic structure from high-dimensional ...

Landmark Map: An Extension of the Self-Organizing Map for a User-Intended Nonlinear Projection (Aug 20 2019). The self-organizing map (SOM) is an unsupervised artificial neural network that is widely used in, e.g., data mining and visualization. Supervised and semi-supervised learning methods have been proposed for the SOM. However, their teacher labels do not ...
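For readers unfamiliar with the SOM, one unsupervised update step can be sketched in a few lines of NumPy (the grid size, learning rate, and neighbourhood width here are arbitrary illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
grid = rng.random((5, 5, 2))          # 5x5 map of 2-D prototype vectors

def som_step(grid, x, lr=0.5, sigma=1.0):
    """One SOM update: find the best-matching unit, pull its grid
    neighbourhood toward the input x with a Gaussian falloff."""
    d = np.linalg.norm(grid - x, axis=2)
    bi, bj = np.unravel_index(d.argmin(), d.shape)
    ii, jj = np.meshgrid(np.arange(grid.shape[0]),
                         np.arange(grid.shape[1]), indexing="ij")
    h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
    return grid + lr * h[..., None] * (x - grid)

x = np.array([0.9, 0.1])
before = np.linalg.norm(grid - x, axis=2).min()
grid = som_step(grid, x)
after = np.linalg.norm(grid - x, axis=2).min()
```

Repeating this step over a dataset, while shrinking `lr` and `sigma`, is what produces the topology-preserving projection the abstract refers to.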

Modeling Major Transitions in Evolution with the Game of Life (Aug 19 2019). Maynard Smith and Szathmáry's book, The Major Transitions in Evolution, describes eight major events in the evolution of life on Earth and identifies a common theme that unites these events. In each event, smaller entities came together to form larger ...

Architecture Search by Estimation of Network Structure Distributions (Aug 19 2019). The influence of deep learning is continuously expanding across different domains, and its new applications are ubiquitous. The question of neural network design thus increases in importance, as traditional empirical approaches are reaching their limits. ...

The Runtime of the Compact Genetic Algorithm on Jump Functions (Aug 18 2019). In the first and so far only mathematical runtime analysis of an estimation-of-distribution algorithm (EDA) on a multimodal problem, Hasenöhrl and Sutton (GECCO 2018) showed for any $k = o(n)$ that the compact genetic algorithm (cGA) with any hypothetical ...
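The cGA analysed here maintains a frequency vector rather than an explicit population. A minimal sketch on the easy OneMax function (the hypothetical population size and budget are illustrative; the paper studies the harder jump functions):

```python
import random

def cga_onemax(n=20, pop=50, budget=5000, seed=1):
    """Compact GA: sample two bitstrings from a probability vector p and
    shift p by 1/pop toward the fitter of the two (fitness = number of ones)."""
    random.seed(seed)
    p = [0.5] * n
    for _ in range(budget):
        a = [int(random.random() < pi) for pi in p]
        b = [int(random.random() < pi) for pi in p]
        if sum(b) > sum(a):
            a, b = b, a                      # a is now the tournament winner
        for i in range(n):
            if a[i] != b[i]:                 # shift only where the two differ
                step = 1 / pop if a[i] else -1 / pop
                p[i] = min(1.0, max(0.0, p[i] + step))
        if all(pi in (0.0, 1.0) for pi in p):
            break                            # model has converged
    return p

p = cga_onemax()
```

On OneMax the frequencies drift toward 1; the paper's point is how the same update behaves when the fitness landscape has a deceptive jump.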

Spike-Train Level Backpropagation for Training Deep Recurrent Spiking Neural Networks (Aug 18 2019). Spiking neural networks (SNNs) are more biologically plausible than conventional artificial neural networks (ANNs). SNNs support spatiotemporal learning and energy-efficient event-driven neuromorphic hardware well. As an important class of ...

VUSFA: Variational Universal Successor Features Approximator to Improve Transfer DRL for Target Driven Visual Navigation (Aug 18 2019). In this paper, we show how novel transfer reinforcement learning techniques can be applied to the complex task of target driven navigation using the photorealistic AI2THOR simulator. Specifically, we build on the concept of Universal Successor Features ...

Anomaly Detection in Video Sequence with Appearance-Motion Correspondence (Aug 17 2019). Anomaly detection in surveillance videos is currently a challenge because of the diversity of possible events. We propose a deep convolutional neural network (CNN) that addresses this problem by learning a correspondence between common object appearances ...

Hybrid Deep Network for Anomaly Detection (Aug 17 2019). In this paper, we propose a deep convolutional neural network (CNN) for anomaly detection in surveillance videos. The model is adapted from a typical auto-encoder working on video patches under the perspective of sparse combination learning. Our CNN focuses ...

Structural Health Monitoring of Cantilever Beam, a Case Study -- Using Bayesian Neural Network AND Deep Learning (Aug 17 2019). The advancement of machine learning algorithms has opened a wide scope for vibration-based SHM (Structural Health Monitoring). Vibration-based SHM is based on the fact that damage will alter the dynamic properties, viz. structural response, frequencies, ...

Multi-Objective Evolutionary Framework for Non-linear System Identification: A Comprehensive Investigation (Aug 17 2019). The present study proposes a multi-objective framework for structure selection of nonlinear systems which are represented by polynomial NARX models. This framework integrates the key components of Multi-Criteria Decision Making (MCDM), which include preference ...

Evolutionary Computation, Optimization and Learning Algorithms for Data Science (Aug 16 2019). A large number of engineering, science and computational problems have yet to be solved in a computationally efficient way. One of the emerging challenges is how evolving technologies grow towards autonomy and intelligent decision making. This leads to ...

Survey on Deep Neural Networks in Speech and Vision Systems (Aug 16 2019). This survey presents a review of state-of-the-art deep neural network architectures, algorithms, and systems in vision and speech applications. Recent advances in deep artificial neural network algorithms and architectures have spurred rapid innovation ...

Performing Deep Recurrent Double Q-Learning for Atari Games (Aug 16 2019). Currently, many applications in machine learning are based on defining new models to extract more information about data. In this case, Deep Reinforcement Learning, with its most common application in video games like Atari, Mario, and others, causes an impact ...

The Partial Response Network (Aug 16 2019). We propose a method to open the black box of the Multi-Layer Perceptron by inferring from it a simpler and generally more accurate general additive model. The resulting model comprises non-linear univariate and bivariate partial responses derived from ...

Matrix Lie Maps and Neural Networks for Solving Differential Equations (Aug 16 2019). The coincidence between polynomial neural networks and matrix Lie maps is discussed in the article. The matrix form of a Lie transform is an approximation of the general solution of a nonlinear system of ordinary differential equations. It can be used ...

Differentiable Learning-to-Group Channels via Groupable Convolutional Neural Networks (Aug 16 2019; updated Aug 19 2019). Group convolution, which divides the channels of ConvNets into groups, has achieved impressive improvements over the regular convolution operation. However, existing models, e.g. ResNeXt, still suffer from sub-optimal performance due to manually defining ...
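Group convolution restricts each output channel to a fixed subset of input channels, cutting parameters by the group count. A NumPy sketch of a 1x1 grouped convolution (shapes and the weight layout are illustrative, not from the paper):

```python
import numpy as np

def grouped_conv1x1(x, weights, groups):
    """1x1 grouped convolution on a (C_in, H, W) tensor: each group of input
    channels is mixed only with its own slice of output channels.
    weights has shape (C_out, C_in // groups)."""
    c_in = x.shape[0]
    c_out = weights.shape[0]
    gi, go = c_in // groups, c_out // groups
    out = np.empty((c_out,) + x.shape[1:])
    for g in range(groups):
        w = weights[g * go:(g + 1) * go, :gi]    # (go, gi) block for group g
        xs = x[g * gi:(g + 1) * gi]              # this group's input channels
        out[g * go:(g + 1) * go] = np.tensordot(w, xs, axes=1)
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4, 4))
w = rng.normal(size=(8, 4))   # groups=2: each output sees 4 of the 8 inputs
y = grouped_conv1x1(x, w, groups=2)
```

A dense 1x1 convolution here would need an 8x8 weight matrix; grouping halves it, which is the parameter saving the abstract mentions, while learning the grouping itself is the paper's contribution.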

Generating Random Parameters in Feedforward Neural Networks with Random Hidden Nodes: Drawbacks of the Standard Method and How to Improve It (Aug 16 2019). The standard method of generating random weights and biases in feedforward neural networks with random hidden nodes selects them both from the uniform distribution over the same fixed interval. In this work, we show the drawbacks of this approach and ...

Automatic Compiler Based FPGA Accelerator for CNN Training (Aug 15 2019). Training of convolutional neural networks (CNNs) on embedded platforms to support on-device learning is gaining vital importance in recent days. Designing flexible training hardware is much more challenging than inference hardware, due to design complexity ...

Improving Randomized Learning of Feedforward Neural Networks by Appropriate Generation of Random Parameters (Aug 15 2019). In this work, a method of random parameter generation for randomized learning of a single-hidden-layer feedforward neural network is proposed. The method first randomly selects the slope angles of the hidden neurons' activation functions from an interval ...

MOEA/D with Uniformly Randomly Adaptive Weights (Aug 15 2019). When working with decomposition-based algorithms, an appropriate set of weights might improve the quality of the final solution. A set of uniformly distributed weights usually leads to well-distributed solutions on a Pareto front. However, there are two main ...
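The uniformly distributed weight vectors mentioned here are typically generated with a simplex-lattice design. A sketch using a stars-and-bars enumeration (one common construction, not necessarily the paper's):

```python
from itertools import combinations

def simplex_lattice_weights(m, H):
    """All m-objective weight vectors whose components are multiples of 1/H
    and sum to 1 -- the uniform design commonly used to initialise MOEA/D."""
    # stars-and-bars: choose positions of m-1 dividers among H+m-1 slots;
    # the gaps between dividers are the integer parts of each weight.
    weights = []
    for dividers in combinations(range(H + m - 1), m - 1):
        prev, parts = -1, []
        for d in dividers:
            parts.append(d - prev - 1)
            prev = d
        parts.append(H + m - 2 - prev)
        weights.append([p / H for p in parts])
    return weights

W = simplex_lattice_weights(m=3, H=4)
```

For m objectives and granularity H this yields C(H+m-1, m-1) vectors; the abstract's concern is that such a fixed uniform set is not always the best choice.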

Neural Network Predictive Controller for Grid-Connected Virtual Synchronous Generator (Aug 14 2019). In this paper, a neural network predictive controller is proposed to regulate the active and the reactive power delivered to the grid by a three-phase virtual inertia-based inverter. The concept of the conventional virtual synchronous generator ...

Heuristic Dynamic Programming for Adaptive Virtual Synchronous Generators (Aug 14 2019). In this paper, a neural network heuristic dynamic programming (HDP) scheme is used for optimal control of the virtual inertia based control of grid-connected three-phase inverters. It is shown that the conventional virtual inertia controllers are not suited for ...

Dual Heuristic Dynamic Programing Control of Grid-Connected Synchronverters (Aug 14 2019). In this paper, a new approach to control a grid-connected synchronverter by using a dual heuristic dynamic programming (DHP) design is presented. The disadvantages of the conventional synchronverter controller, such as the challenges to cope with nonlinearity, ...

Unconstrained Monotonic Neural Networks (Aug 14 2019). Monotonic neural networks have recently been proposed as a way to define invertible transformations. These transformations can be combined into powerful autoregressive flows that have been shown to be universal approximators of continuous probability ...
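One way to obtain a monotonic, hence invertible, scalar transformation in the spirit of this line of work is to integrate a strictly positive network output. A rough NumPy sketch (the architecture, random weights, and trapezoid integration are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 1)), rng.normal(size=16)
W2, b2 = rng.normal(size=(1, 16)), rng.normal(size=1)

def positive_net(t):
    """Small random MLP whose scalar output is forced positive via softplus."""
    h = np.tanh(W1 @ np.atleast_1d(t) + b1)
    return np.log1p(np.exp(W2 @ h + b2))[0]

def monotone_f(x, steps=200):
    """f(x) = integral from 0 to x of a strictly positive integrand
    (trapezoid rule), so f is strictly increasing by construction."""
    ts = np.linspace(0.0, x, steps)
    vals = np.array([positive_net(t) for t in ts])
    return float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(ts)))

ys = [monotone_f(x) for x in (-1.0, 0.0, 1.0, 2.0)]
```

Because the integrand never changes sign, f is invertible without placing sign constraints on the network's weights, which is the "unconstrained" idea the title alludes to.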

Constrained Multi-Objective Optimization for Automated Machine Learning (Aug 14 2019). Automated machine learning has gained a lot of attention recently. Building and selecting the right machine learning models is often a multi-objective optimization problem. General purpose machine learning software that simultaneously supports multiple ...

Monthly electricity consumption forecasting by the fruit fly optimization algorithm enhanced Holt-Winters smoothing method (Aug 14 2019). Electricity consumption forecasting is a critical component of the intelligent power system, and accurate monthly electricity consumption forecasting, as one of the medium- and long-term electricity consumption forecasting problems, plays an important ...
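The additive Holt-Winters method referenced here smooths level, trend, and seasonal components; the fruit fly optimizer would tune the three smoothing constants, which are simply fixed by hand in this sketch (the toy series is illustrative, not the paper's data):

```python
def holt_winters_additive(y, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters: recursively updated level, trend, and
    m-periodic seasonal components; returns `horizon` forecasts."""
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]
    for t in range(len(y)):
        last_level = level
        level = alpha * (y[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * season[t % m]
    return [level + (h + 1) * trend + season[(len(y) + h) % m]
            for h in range(horizon)]

# Six years of a pure monthly-style seasonal pattern (period 4 for brevity);
# the forecasts should echo the pattern.
pattern = [10, 12, 14, 12]
y = pattern * 6
fc = holt_winters_additive(y, m=4, alpha=0.3, beta=0.1, gamma=0.2, horizon=4)
```

On this noiseless seasonal series the recursion is stationary, so the forecasts reproduce the pattern exactly; on real consumption data the quality hinges on alpha, beta, and gamma, which is where the metaheuristic search comes in.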

A Deep Evolutionary Approach to Bioinspired Classifier Optimisation for Brain-Machine Interaction (Aug 13 2019). This study suggests a new approach to EEG data classification by exploring the idea of using evolutionary computation to both select useful discriminative EEG features and optimise the topology of Artificial Neural Networks. An evolutionary algorithm ...

Bayesian automated posterior repartitioning for nested sampling (Aug 13 2019). Priors in Bayesian analyses often encode informative domain knowledge that can be useful in making the inference process more efficient. Occasionally, however, priors may be unrepresentative of the parameter values for a given dataset, which can result ...

Adversarial Neural Pruning (Aug 12 2019). It is well known that neural networks are susceptible to adversarial perturbations and are also computationally and memory intensive, which makes it difficult to deploy them in real-world applications where security and computation are constrained. In ...

Implementing Binarized Neural Networks with Magnetoresistive RAM without Error Correction (Aug 12 2019). One of the most exciting applications of Spin Torque Magnetoresistive Random Access Memory (ST-MRAM) is the in-memory implementation of deep neural networks, which could allow improving the energy efficiency of Artificial Intelligence by orders of magnitude ...

ACNet: Strengthening the Kernel Skeletons for Powerful CNN via Asymmetric Convolution Blocks (Aug 11 2019). As designing an appropriate Convolutional Neural Network (CNN) architecture in the context of a given application usually involves heavy human work or numerous GPU hours, the research community is soliciting architecture-neutral CNN structures, which ...

Data-Driven Randomized Learning of Feedforward Neural Networks (Aug 11 2019). Randomized methods of neural network learning suffer from a problem with the generation of random parameters, as they are difficult to set optimally to obtain a good projection space. The standard method draws the parameters from a fixed interval, which ...

One-time learning and reverse salience signal with a salience affected neural network (SANN) (Aug 09 2019). Standard artificial neural networks model key cognitive aspects of brain function, such as learning and classification, but they do not model the affective (emotional) aspects; however, primary and secondary emotions play a key role in interactions with ...

Optimizing quantum heuristics with meta-learning (Aug 08 2019). Variational quantum algorithms, a class of quantum heuristics, are promising candidates for the demonstration of useful quantum computation. Finding the best way to amplify the performance of these methods on hardware is an important task. Here, we evaluate ...

TensorDIMM: A Practical Near-Memory Processing Architecture for Embeddings and Tensor Operations in Deep Learning (Aug 08 2019). Recent studies from several hyperscalers point to embedding layers as the most memory-intensive deep learning (DL) algorithm being deployed in today's datacenters. This paper addresses the memory capacity and bandwidth challenges of embedding layers ...

One Model To Rule Them All (Aug 08 2019). We present a new flavor of Variational Autoencoder (VAE) that interpolates seamlessly between unsupervised, semi-supervised and fully supervised learning domains. We show that unlabeled datapoints not only boost unsupervised tasks, but also the classification ...

A Multimodal Deep Network for the Reconstruction of T2W MR Images (Aug 08 2019). Multiple sclerosis is one of the most common chronic neurological diseases affecting the central nervous system. Lesions produced by MS can be observed through two modalities of magnetic resonance (MR), known as T2W and FLAIR sequences, both providing ...

Benchmarking Surrogate-Assisted Genetic Recommender Systems (Aug 08 2019). We propose a new approach for building recommender systems by adapting surrogate-assisted interactive genetic algorithms. A pool of user-evaluated items is used to construct an approximative model which serves as a surrogate fitness function in a genetic ...

Visualizing the PHATE of Neural Networks (Aug 07 2019). Understanding why and how certain neural networks outperform others is key to guiding future development of network architectures and optimization methods. To this end, we introduce a novel visualization algorithm that reveals the internal geometry of ...

Self-Organizing Maps with Variable Input Length for Motif Discovery and Word Segmentation (Aug 07 2019). Time Series Motif Discovery (TSMD) is defined as searching for patterns that are previously unknown and appear with a given frequency in time series. Another problem strongly related to TSMD is Word Segmentation. This problem has received much attention ...

3D-aCortex: An Ultra-Compact Energy-Efficient Neurocomputing Platform Based on Commercial 3D-NAND Flash Memories (Aug 07 2019). The first contribution of this paper is the development of extremely dense, energy-efficient mixed-signal vector-by-matrix-multiplication (VMM) circuits based on the existing 3D-NAND flash memory blocks, without any need for their modification. Such compatibility ...

Cheetah: Mixed Low-Precision Hardware & Software Co-Design Framework for DNNs on the Edge (Aug 06 2019). Low-precision DNNs have been extensively explored in order to reduce the size of DNN models for edge devices. Recently, the posit numerical format has shown promise for DNN data representation and compute with ultra-low precision in [5..8]-bits. However, ...

Refactoring Neural Networks for Verification (Aug 06 2019). Deep neural networks (DNN) are growing in capability and applicability. Their effectiveness has led to their use in safety critical and autonomous systems, yet there is a dearth of cost-effective methods available for reasoning about the behavior of a ...

Random Directional Attack for Fooling Deep Neural Networks (Aug 06 2019). Deep neural networks (DNNs) have been widely used in many fields such as image processing and speech recognition; however, they are vulnerable to adversarial examples, and this is a security issue worthy of attention. Because the training process of DNNs ...

Neuroscience-inspired online unsupervised learning algorithms (Aug 05 2019). Although the currently popular deep learning networks achieve unprecedented performance on some tasks, the human brain still has a monopoly on general intelligence. Motivated by this and the biological implausibility of deep learning networks, we developed ...

Unsupervised Representations of Pollen in Bright-Field Microscopy (Aug 05 2019). We present the first unsupervised deep learning method for pollen analysis using bright-field microscopy. Using a modest dataset of 650 images of pollen grains collected from honey, we achieve family level identification of pollen. We embed images of ...

Gradient Descent Finds Global Minima for Generalizable Deep Neural Networks of Practical Sizes (Aug 05 2019). In this paper, we theoretically prove that gradient descent can find a global minimum for nonlinear deep neural networks of sizes commonly encountered in practice. The theory developed in this paper requires only the number of trainable parameters to ...

Sample size calculations for the experimental comparison of multiple algorithms on multiple problem instances (Aug 05 2019). This work presents a statistically principled method for estimating the required number of instances in the experimental comparison of multiple algorithms on a given problem class of interest. This approach generalises earlier results by allowing researchers ...

Construction of Macro Actions for Deep Reinforcement Learning (Aug 05 2019). Conventional deep reinforcement learning typically determines an appropriate primitive action at each timestep, which requires an enormous amount of time and effort to learn an effective policy, especially in large and complex environments. To deal with ...

MoGA: Searching Beyond MobileNetV3 (Aug 04 2019). The evolution of MobileNets has laid a solid foundation for neural network applications on the mobile end. With the latest MobileNetV3, neural architecture search again claimed its supremacy in network design. To date, all mobile methods mainly focus ...

Hopfield Neural Network Flow: A Geometric Viewpoint (Aug 04 2019). We provide gradient flow interpretations for the continuous-time continuous-state Hopfield neural network (HNN). The ordinary and stochastic differential equations associated with the HNN were introduced in the literature as analog optimizers, and were ...

Multi-node environment strategy for Parallel Deterministic Multi-Objective Fractal Decomposition (Aug 04 2019). This paper presents a new implementation of deterministic multiobjective (MO) optimization called the Multiobjective Fractal Decomposition Algorithm (Mo-FDA). The original algorithm was designed for mono-objective large scale continuous optimization problems. ...

Sentiment Analysis of Typhoon Related Tweets using Standard and Bidirectional Recurrent Neural Networks (Aug 03 2019). The Philippines is prone to natural calamities like typhoons, floods, volcanic eruptions and earthquakes. With Twitter one of the most used social media platforms in the Philippines, a total of 39,867 preprocessed tweets were obtained given ...

Weight Friction: A Simple Method to Overcome Catastrophic Forgetting and Enable Continual Learning (Aug 02 2019; updated Aug 17 2019). In recent years, deep neural networks have found success in replicating human-level cognitive skills, yet they suffer from several major obstacles. One significant limitation is the inability to learn new tasks without forgetting previously learned tasks, ...

Merging variables: one technique of search in pseudo-Boolean optimization (Aug 02 2019). In the present paper we describe a new heuristic technique which can be applied to the optimization of pseudo-Boolean functions, including Black-Box functions. This technique is based on a simple procedure which consists in a transition from the optimization ...

Conditional Finite Mixtures of Poisson Distributions for Context-Dependent Neural Correlations (Aug 01 2019). Parallel recordings of neural spike counts have revealed the existence of context-dependent noise correlations in neural populations. Theories of population coding have also shown that such correlations can impact the information encoded by neural populations ...

A self-organizing fuzzy neural network for sequence learning (Aug 01 2019). In this paper, a new self-organizing fuzzy neural network model is presented which is able to learn and reproduce different sequences accurately. Sequence learning is important in performing skillful tasks, such as writing and playing piano. The structure ...

Estimation of Tire-Road Friction for Autonomous Vehicles: a Neural Network Approach (Aug 01 2019). The performance of vehicle active safety systems is dependent on the friction force arising from the contact of tires and the road surface. Therefore, an adequate knowledge of the tire-road friction coefficient is of great importance to achieve a good ...

Popt4jlib: A Parallel/Distributed Optimization Library for Java (Aug 01 2019). This paper describes the architectural design as well as key implementation details of the Open Source popt4jlib library (https://githhub.org/ioannischristou/popt4jlib) that contains a fairly large number of meta-heuristic and other exact optimization ...

LoadCNN: A Efficient Green Deep Learning Model for Day-ahead Individual Resident Load Forecasting (Aug 01 2019). Accurate day-ahead individual resident load forecasting is very important to various applications of the smart grid. As a powerful machine learning technology, deep learning has shown great advantages in the load forecasting task. However, deep learning is a ...

Multi-Scale Learned Iterative Reconstruction (Aug 01 2019). Model-based learned iterative reconstruction methods have recently been shown to outperform classical reconstruction methods. Applicability of these methods to large scale inverse problems is however limited by the available memory for training and extensive ...

Biologically inspired sleep algorithm for artificial neural networks (Aug 01 2019). Sleep plays an important role in incremental learning and consolidation of memories in biological systems. Motivated by the processes that are known to be involved in sleep generation in biological networks, we developed an algorithm that implements a ...

Graph Neural Networks for Small Graph and Giant Network Representation Learning: An Overview (Aug 01 2019). Graph neural networks denote a group of neural network models introduced for representation learning tasks on graph data specifically. Graph neural networks have been demonstrated to be effective for capturing network structure information, and the ...

Competitive Co-evolution for Dynamic Constrained Optimisation (Jul 31 2019). Dynamic constrained optimisation problems (DCOPs) widely exist in the real world due to frequently changing factors influencing the environment. Many dynamic optimisation methods such as diversity-driven methods, memory and prediction methods offer different ...

Evolutionary Dataset Optimisation: learning algorithm quality through evolution (Jul 31 2019). In this paper we propose a new method for learning how algorithms perform. Classically, algorithms are compared on a finite number of existing (or newly simulated) benchmark data sets based on some fixed metrics. The algorithm(s) with the smallest value ...

On the difficulty of learning and predicting the long-term dynamics of bouncing objects (Jul 31 2019). The ability to accurately predict the surrounding environment is a foundational principle of intelligence in biological and artificial agents. In recent years, a variety of approaches have been proposed for learning to predict the physical dynamics of ...

Neural networks-based backward scheme for fully nonlinear PDEs (Jul 31 2019). We propose a numerical method for solving high dimensional fully nonlinear partial differential equations (PDEs). Our algorithm estimates simultaneously by backward time induction the solution and its gradient by multi-layer neural networks, through a ...

Temporal coding in spiking neural networks with alpha synaptic function (Jul 30 2019). The timing of individual neuronal spikes is essential for biological brains to make fast responses to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose ...
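The alpha synaptic function referenced in the title is a kernel that rises and then decays, peaking at the synaptic time constant. A sketch (normalisation to a unit peak is one common convention, not necessarily the paper's):

```python
import numpy as np

def alpha_kernel(t, tau=1.0):
    """Alpha synaptic kernel (t/tau)*exp(1 - t/tau) for t >= 0:
    rises from 0, peaks at exactly t = tau with value 1, then decays."""
    t = np.asarray(t, dtype=float)
    return np.where(t >= 0, (t / tau) * np.exp(1 - t / tau), 0.0)

ts = np.linspace(0, 10, 1001)
v = alpha_kernel(ts, tau=2.0)
```

Because the kernel is differentiable in both t and tau, spike times driven by such synapses can be trained with gradient descent, which is what makes temporal coding learnable in this setting.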

Ablate, Variate, and Contemplate: Visual Analytics for Discovering Neural Architectures (Jul 30 2019). Deep learning models require the configuration of many layers and parameters in order to get good results. However, there are currently few systematic guidelines for how to configure a successful model. This means model builders often have to experiment ...

Deep learning research landscape & roadmap in a nutshell: past, present and future -- Towards deep cortical learning (Jul 30 2019). The past, present and future of deep learning is presented in this work. Given this landscape & roadmap, we predict that deep cortical learning will be the convergence of deep learning & cortical learning which builds an artificial cortical column ultimately. ...

GENESIS: Generative Scene Inference and Sampling with Object-Centric Latent Representations (Jul 30 2019). Generative models are emerging as promising tools in robotics and reinforcement learning. Yet, even though tasks in these domains typically involve distinct objects, most state-of-the-art methods do not explicitly capture the compositional nature of visual ...

Incremental Bounded Model Checking of Artificial Neural Networks in CUDA (Jul 30 2019). Artificial Neural Networks (ANNs) are powerful computing systems employed for various applications due to their versatility to generalize and to respond to unexpected inputs/patterns. However, implementations of ANNs for safety-critical systems might ...

Exponential Slowdown for Larger Populations: The $(\mu+1)$-EA on Monotone Functions (Jul 30 2019). Pseudo-Boolean monotone functions are unimodal functions which are trivial to optimize for some hillclimbers, but are challenging for a surprising number of evolutionary algorithms (EAs). A general trend is that EAs are efficient if parameters like the ...
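For context, a $(\mu+1)$-EA keeps $\mu$ parents and replaces the worst individual when a mutated child is at least as fit. A minimal sketch on OneMax, an easy unimodal case (population size, mutation rate, and budget are illustrative; the paper's monotone functions are where larger populations become exponentially slower):

```python
import random

def mu_plus_one_ea(n=30, mu=3, budget=20000, seed=0):
    """(mu+1)-EA on OneMax: mutate a uniformly chosen parent with bit-flip
    rate 1/n; the child replaces the worst individual if it is no worse."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for _ in range(budget):
        parent = random.choice(pop)
        child = [1 - b if random.random() < 1 / n else b for b in parent]
        worst = min(range(mu), key=lambda i: sum(pop[i]))
        if sum(child) >= sum(pop[worst]):
            pop[worst] = child
        if any(sum(ind) == n for ind in pop):
            break                            # optimum found
    return max(sum(ind) for ind in pop)

best = mu_plus_one_ea()
```

On OneMax this reliably reaches the optimum within the budget; the paper's result is that on some monotone functions the same scheme slows down exponentially as mu grows.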

An artificial life approach to studying niche differentiation in soundscape ecology (Jul 30 2019). Artificial life simulations are an important tool in the study of ecological phenomena that can be difficult to examine directly in natural environments. Recent work has established the soundscape as an ecologically important resource, and it has been ...

EVO* 2019 -- Late-Breaking Abstracts Volume (Jul 30 2019). This volume contains the Late-Breaking Abstracts submitted to the EVO* 2019 Conference, which took place in Leipzig from 24 to 26 April. These papers were presented as short talks and also at the poster session of the conference, together with other ...

Particle Swarm Optimisation for Evolving Deep Neural Networks for Image Classification by Evolving and Stacking Transferable Blocks (Jul 29 2019). Deep Convolutional Neural Networks (CNNs) have been widely used in image classification tasks, but the process of designing CNN architectures is very complex, so Neural Architecture Search (NAS), automatically searching for optimal CNN architectures, ...

MineRL: A Large-Scale Dataset of Minecraft Demonstrations (Jul 29 2019). The sample inefficiency of standard deep reinforcement learning methods precludes their application to many real-world problems. Methods which leverage human demonstrations require fewer samples but have been researched less. As demonstrated in the computer ...

RNNbow: Visualizing Learning via Backpropagation Gradients in Recurrent Neural Networks (Jul 29 2019). We present RNNbow, an interactive tool for visualizing the gradient flow during backpropagation training in recurrent neural networks. RNNbow is a web application that displays the relative gradient contributions from Recurrent Neural Network (RNN) cells ...

On the Limitations of the Univariate Marginal Distribution Algorithm to Deception and Where Bivariate EDAs might help (Jul 29 2019). We introduce a new benchmark problem called Deceptive Leading Blocks (DLB) to rigorously study the runtime of the Univariate Marginal Distribution Algorithm (UMDA) in the presence of epistasis and deception. We show that simple Evolutionary Algorithms ...

Modulation of early visual processing alleviates capacity limits in solving multiple tasks (Jul 29 2019; updated Jul 30 2019). In daily life situations, we have to perform multiple tasks given a visual stimulus, which requires task-relevant information to be transmitted through our visual system. When it is not possible to transmit all the possibly relevant information to higher ...

Adaptive spline fitting with particle swarm optimization (Jul 28 2019). In fitting data with a spline, finding the optimal placement of knots can significantly improve the quality of the fit. However, the challenging high-dimensional and non-convex optimization problem associated with completely free knot placement has been ...
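Particle swarm optimization, the search method used here for knot placement, can be sketched in plain Python (the inertia and acceleration coefficients are common textbook values, not the paper's settings, and the sphere function stands in for the spline-fitting objective):

```python
import random

def pso_minimize(f, dim=2, n=20, iters=200, seed=0):
    """Minimal particle swarm: each particle is pulled toward its own best
    position and the swarm's global best, with inertia 0.7 and c1 = c2 = 1.5."""
    random.seed(seed)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = pbest[min(range(n), key=lambda i: pbest_f[i])][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < f(g):
                    g = pos[i][:]
    return g, f(g)

best, best_f = pso_minimize(lambda p: sum(x * x for x in p))
```

Swapping the sphere objective for a spline residual over candidate knot positions gives the high-dimensional, non-convex search the abstract describes.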

On the Robustness of Median Sampling in Noisy Evolutionary Optimization (Jul 28 2019). In real-world optimization tasks, the objective (i.e., fitness) function evaluation is often disturbed by noise due to a wide range of uncertainties. Evolutionary algorithms (EAs) have been widely applied to tackle noisy optimization, where reducing the ...

Spatiotemporal Information Processing with a Reservoir Decision-making Network (Jul 28 2019). Spatiotemporal information processing is fundamental to brain functions. The present study investigates a canonic neural network model for spatiotemporal pattern recognition. Specifically, the model consists of two modules, a reservoir subnetwork and ...

SOM-Guided Evolutionary Search for Solving MinMax Multiple-TSP (Jul 27 2019). Multiple-TSP, also abbreviated in the literature as mTSP, is an extension of the Traveling Salesman Problem that lies at the core of many variants of the Vehicle Routing problem of great practical importance. The current paper develops and experiments ...

Training products of expert capsules with mixing by dynamic routing (Jul 26 2019). This study develops an unsupervised learning algorithm for products of expert capsules with dynamic routing. Analogous to binary-valued neurons in Restricted Boltzmann Machines, the magnitude of a squashed capsule firing takes values between zero and ...

Training capsules as a routing-weighted product of expert neurons (Jul 26 2019). Capsules are the multidimensional analogue to scalar neurons in neural networks, and because they are multidimensional, much more complex routing schemes can be used to pass information forward through the network than what can be used in traditional ...

Autoencoding with a Learning Classifier System: Initial Results (Jul 26 2019; updated Jul 29 2019). Autoencoders enable data dimensionality reduction and are a key component of many (deep) learning systems. This short paper introduces a form of Holland's Learning Classifier System (LCS) to perform autoencoding, building upon a previously presented form of ...

Semisupervised Adversarial Neural Networks for Cyber Security Transfer Learning (Jul 25 2019). On the path to establishing a global cybersecurity framework where each enterprise shares information about malicious behavior, an important question arises. How can a machine learning representation characterizing a cyber attack on one network be used ...

Graph Neural Lasso for Dynamic Network Regression (Jul 25 2019). In this paper, we will study the dynamic network regression problem, which focuses on inferring both individual entities' changing attribute values and the dynamic relationships among the entities in the network data simultaneously. To resolve the problem, ...

Benchmarking HillVallEA for the GECCO 2019 Competition on Multimodal Optimization (Jul 25 2019). This report presents benchmarking results of the Hill-Valley Evolutionary Algorithm version 2019 (HillVallEA19) on the CEC2013 niching benchmark suite under the restrictions of the GECCO 2019 niching competition on multimodal optimization. Performance ...

Knowledge transfer in deep block-modular neural networks (Jul 24 2019). Although deep neural networks (DNNs) have demonstrated impressive results during the last decade, they remain highly specialized tools, which are trained -- often from scratch -- to solve each particular task. The human brain, in contrast, significantly ...

A Fine-Grained Spectral Perspective on Neural Networks (Jul 24 2019). Are neural networks biased toward simple functions? Does depth always help learn more complex features? Is training the last layer of a network as good as training all layers? These questions seem unrelated at face value, but in this work we give all ...