Friday, 22 April 2016

Integrating the Concept of Meta-heuristics to Neural Networks

Exploring the topic of meta-heuristics (see article here) further, we can structure ways to implement them within a neural network, which is useful for solving certain classes of problems.

When using metaheuristic algorithms, we define the search space the algorithm works within based on constraints, i.e. we frame the individual signalling nodes within a configuration space.

We define a constrained metaheuristic in terms of a network structure. The constraints themselves are imposed by the input weights, W_{i,j}.

For a given input vector: 
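The original equation is not reproduced here; a minimal sketch, assuming an n-component real-valued input, would be:

```latex
\mathbf{x} = (x_1, x_2, \ldots, x_n)^{\mathsf{T}}
```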

In a neural network structure we weight the inputs and sum them through k hidden layers in the following summation:
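A sketch of this weighted summation, assuming a_j^{(l-1)} denotes the activations of the previous layer and l runs over the k hidden layers (the layer superscripts are my notation, not the source's):

```latex
S_i^{(l)} = \sum_{j} W_{i,j}^{(l)} \, a_j^{(l-1)}, \qquad l = 1, \ldots, k
```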

Our activation function, A(x), which passes the weighted values through the k layers, then takes the same form as the signalling basis function of a metaheuristic algorithm, which is based on the Gaussian form of the signalling intensity and decreases monotonically with distance.

The activation then takes the form:
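Assuming the standard logistic form of the sigmoid, the activation can be sketched as:

```latex
A(x) = \frac{1}{1 + e^{-x}}
```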

This is a sigmoidal activation function.

This is exactly the same form of equation as the meta-heuristic activation function discussed in the previous chapter of this study.

Hence the “hidden” layers of the neural network amount to a summation over all possible weightings of the inputs passed through the sigmoidal activation function of the network.

The output is then a finite, discrete integer response, representable on a digital number line, that is characteristic of the initial input vector, effectively broadened by the k-layered neural network:
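One plausible sketch of such a discretized readout, assuming the final-layer activations a_i^{(k)} are combined with output weights w_i and rounded to the nearest integer (the rounding step and the symbols w_i, a_i^{(k)} are my assumptions, not stated in the source):

```latex
y = \operatorname{round}\!\left( \sum_{i} w_i \, a_i^{(k)} \right)
```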

In the neural network picture, metaheuristic signalling processes are then reduced simply to the weights acting on the input vector:

I represent this construct in the diagram below, in which a Radial Basis Signal Function, implemented in Matlab, passes through a neural network: the network receives a signal and creates an output response characteristic of this input, broadened by the k paths of sigmoidal activation signalling within the network:
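The construct can be sketched in Python rather than the original Matlab; the layer sizes, weight scale, and RBF parameters below are illustrative assumptions, not the values used in the diagram:

```python
import numpy as np

def rbf_signal(x, center=0.0, width=1.0):
    """Gaussian radial basis 'signal' function: intensity decreases
    monotonically away from the centre."""
    return np.exp(-((x - center) ** 2) / (2 * width ** 2))

def sigmoid(z):
    """Sigmoidal activation A(z) = 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights):
    """Pass the signal through k weighted, sigmoid-activated layers."""
    a = x
    for W in weights:
        a = sigmoid(W @ a)
    return a

rng = np.random.default_rng(0)
inputs = rbf_signal(np.linspace(-3, 3, 8))  # input vector sampled from the RBF signal
k = 3                                       # assumed number of hidden layers
weights = [rng.standard_normal((8, 8)) * 0.5 for _ in range(k)]
output = forward(inputs, weights)           # broadened output response
```

Each layer applies the same sigmoidal activation to a weighted sum, so the final `output` is the input signal broadened by the k signalling paths through the network.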

Non-stochastic processes can be simulated quite well by neural networks in this fashion. Moreover, by using meta-heuristics to design specific power-law-based signalling activation functions, we can structure neural network simulations for real-world design applications.
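As a hedged illustration of a power-law-based signalling activation: the specific form I(r) = 1 / (1 + γ·|r|^m) is my assumption (the source does not give one), chosen so the intensity still decreases monotonically with distance, like the Gaussian form above:

```python
import numpy as np

def power_law_activation(r, gamma=1.0, m=2):
    """Hypothetical power-law signalling intensity I(r) = 1 / (1 + gamma * |r|**m).
    Peaks at r = 0 and decays monotonically with distance r."""
    return 1.0 / (1.0 + gamma * np.abs(r) ** m)

r = np.linspace(0, 5, 6)            # distances 0, 1, ..., 5
intensity = power_law_activation(r)  # monotonically decreasing intensities
```

Unlike the Gaussian, this intensity decays polynomially rather than exponentially, which is the kind of design freedom a metaheuristic could exploit when shaping a signalling activation.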

It also turns out that one can take this mathematical reduction of metaheuristics and neural networks further, joining the two theories through a path integral interpretation of machine signalling. In principle, this provides a somewhat more fundamental view of how self-organizing synchronization can occur in certain networks that have not been explicitly programmed.
