Boolean Models Guide Intentionally Continuous Information and Computation Inside the Brain

Germano Resconi

Mathematical and Physical Department, Catholic University, Via Trieste 17, Brescia

Corresponding Author’s E-mail: resconi42@gmail.com

DOI: http://dx.doi.org/10.13005/ojcst12.03.03

Article Publishing History
Article Received on: 15/06/2019
Article Accepted on: 11/09/2019
Article Published on: 13/09/2019
ABSTRACT:

In 1943 McCulloch and Pitts created the formal neuron, in which many input signals are linearly composed with different weights on the neuron soma. When the electrical signal at the soma exceeds a specific threshold, an output is produced. The key point of this model is that the response reproduces a Boolean function, of the kind widely used in digital computers. Logic functions can thus be simplified with the formal neuron. But there is a serious limitation: not all logic functions, XOR among them, can be realized by a single formal neuron. Much later, back propagation and many other neural models overcame this limitation in some cases, but not in all, leaving considerable uncertainty. The model proposed here does not use the formal neuron but the natural network, governed by a set of differential equations for the neural channels that model the current and voltage on the neuron surface. The steady state of the channel probabilities is the activation state, a continuous function whose maximum and minimum are the values of the Boolean function associated with the activation times of the neuron's spikes. With this method the activation function can be designed when the Boolean function is known. Moreover, the neuron's differential equation can be designed so as to realize the wanted Boolean function in the neuron itself. The activation function theory permits us to compute the neural parameters in agreement with the intention.

KEYWORDS: Activation Function; Boolean Function; Digital and Continuous Computation; Differential Equation for Neuron Channel; Intention Implemented into the Brain Parameters; Natural Neuron


Introduction

In this paper we present a method to transform every Boolean function into a one-dimensional continuous function, denoted the implication Boolean function or activation function. The values of the Boolean function are represented by the maximum value, one (true), and the minimum value, zero (false); between true and false lie all possible degrees of truth. The ionic channels of a single neuron can be modelled electrically so as to realize the wanted implication Boolean function. This function can then be transmitted through space and time to other neurons by a spike transformation of the implication Boolean function. The synaptic degrees of the superposition of the spikes can be computed by a special code to generate the wanted function. With this code every Boolean function can be generated from other input Boolean functions. A multi-channel network can be used to obtain any type of complex Boolean function in the steady-state condition. We remark that discrete and continuous logic can thus be implemented in the brain to represent the intention of the agent in a physical way.
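As a first concrete illustration of this transformation, the minimal sketch below (Python) interpolates the truth table of AND into a one-dimensional continuous function whose maximum value 1 and minimum value 0 reproduce the Boolean values; the single-axis encoding x = 2·x1 + x2 and the piecewise-linear interpolation are our illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Truth table of AND, with the input pair (x1, x2) encoded on one axis
# as x = 2*x1 + x2 (an illustrative choice).
truth_values = np.array([0.0, 0.0, 0.0, 1.0])  # AND(0,0), AND(0,1), AND(1,0), AND(1,1)
anchors = np.arange(4)                          # x = 0, 1, 2, 3

def activation(x):
    """Continuous 'implication Boolean function': piecewise-linear
    interpolation whose values at the anchors reproduce the truth table."""
    return np.interp(x, anchors, truth_values)

# At the integer points the function is exactly Boolean ...
assert activation(3) == 1.0 and activation(0) == 0.0
# ... and between them it takes every intermediate degree of truth.
print(activation(2.5))   # 0.5
```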

Neural Solution of the Boolean Contradictory Function by the Dependent Function k'(x) [5,6,7,8]

Graph of the dependent function k’(x) for AND

Figure 1: AND operation by one dependent function in (1)

Figure 2: Negation of the AND operation by dependent function in (2)

Figure 3: Negation of the AND operation by dependent function in (3)

Boolean implication function (activation function)


Figure 4: Negation of the AND operation by dependent function in (4)

We remark that for this Boolean function k(x) has a singular point. For the original dependent function we have the scheme

We can change the form of k(x), keeping its original properties but without the negative values of k(x) and without the singularity. So we have

Now we have


Figure 5: Implication dependent function k(x) (7)

3. Nonlinear neuron and dependent function k’(x)

For the Boolean function and the dependent function we can create the following scheme for a nonlinear neuron; a concrete sketch is given after figure 6.

Figure 6: Dependent function and nonlinear neural network
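To see why this scheme escapes the limitation of the formal neuron, consider XOR, the classic failure case mentioned in the abstract. A minimal sketch, assuming a dependent function applied to the simple sum of the inputs; this is one illustrative way to realize the idea, not necessarily the construction shown in the figures:

```python
def k_prime(s):
    """Illustrative dependent function: peaks at s = 1, zero at s = 0 and s = 2."""
    return max(0.0, 1.0 - abs(s - 1.0))

def xor_neuron(x1, x2):
    # A single nonlinear unit: sum the inputs, then apply the dependent
    # function. XOR is 1 exactly when the sum is 1.
    return k_prime(x1 + x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_neuron(x1, x2))   # 0 0 0.0 / 0 1 1.0 / 1 0 1.0 / 1 1 0.0
```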

4. Machines and systems by the nonlinear neuron

Given the machine with input x, states q, and output y

Graphic image of the Boolean system for the first equation by elementary Boolean functions AND, OR, and NOT.

Figure 7: System by the classical gate operators AND, OR, NOT

For the input x = 0 we have the state transition
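Since the machine's equations appear only in the figures of the source, the sketch below shows only the generic shape of such a system: a next-state function and an output function built from the elementary gates AND, OR, NOT. The particular equations used here are hypothetical placeholders, not the ones in figure 7.

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def step(x, q):
    """One transition of a Boolean machine: next state q' and output y.
    The particular equations are illustrative placeholders."""
    q_next = OR(NOT(q), x)
    y = AND(q, x)
    return q_next, y

# Drive the machine with the input x = 0 from both states.
for q in (0, 1):
    print(f"x=0, q={q} ->", step(0, q))
```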

Figure 8: Nonlinear network for the system.

5. Natural Neural Network [9,10,11,12,13]

Figure 9: Neuron's dendrites as input and axonal arborisation structure

The information in the neuron is mediated by the channels on the membrane surface, as we can see in figure 10.

Figure 10: Neural channels inside the neuron

We know that each channel in figure 10 is a graduated or fuzzy switch that depends on different elements that can open or close the channel by an active or a passive process. In figure 10 we can see the mutual dependences among channels, which can change the electrical charges in the membrane to generate the activation potential or other complex changes in the membrane electrical potential. Each channel controls the inward and outward movement of ions such as K+ or Na+.

Mathematical representation of the spike process by the Hodgkin-Huxley model

The Hodgkin-Huxley model assumes that the electrical activity of the squid giant axon is mainly due to the movement of Na+ and K+ ions across the membrane. Thus, in the model, the neuronal membrane contains Na+ channels, K+ channels, and a leakage channel through which various other ionic species, such as chloride Cl−, can pass. The equivalent circuit diagram corresponding to the Hodgkin-Huxley model is shown in figure 11.

Figure 11: Circuit diagram for the Hodgkin-Huxley model of the squid giant axon.

The Hodgkin-Huxley electrical image of the neuron [13] is given by the equation

where the rate constant b is the activation rate constant with which the ionic channel passes from the closed state to the open state and the ion moves from the inside to the outside of the neuron. The rates a and b are coupled to the electrical system of the neuron by the voltages on the membrane. The channel process is given by the following system (figure 12).

Figure 12: Model of the ion movement from the inside to the outside of the cell under voltage control

Figure 13: Image of the input-output channel dynamics
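As a compact reference for the model just described, the following sketch integrates the classical Hodgkin-Huxley equations [13] with forward Euler, using the standard squid-axon parameters in the modern sign convention. It is our own minimal implementation, not the code behind the figures.

```python
import numpy as np

# Standard squid-axon parameters (Hodgkin & Huxley, 1952), modern sign convention.
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3          # uF/cm^2, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.387                # mV

def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T = 0.01, 50.0                                   # ms
V, m, h, n = -65.0, 0.05, 0.6, 0.32                  # resting initial conditions
for t in np.arange(0.0, T, dt):
    I_ext = 10.0 if t > 5.0 else 0.0                 # step current, uA/cm^2
    # Each gating variable obeys dx/dt = alpha(V)*(1 - x) - beta(V)*x.
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    # Membrane equation: capacitive current balances ionic + external currents.
    I_ion = (g_Na * m**3 * h * (V - E_Na) + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_ext - I_ion) / C_m
print("final V (mV):", V)
```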

Boolean function and activation function by electronic systems and channels

Each channel controls the inward and outward movement of ions such as K+ or Na+. The channel can be represented in a very simple way, following the graph in figure 12, by the system shown in figure 14.

Figure 14: Simple system of the outside-inside channel

The system is represented by the differential equation
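The equation itself survives only as an image in the source; from the surrounding text (S2 = 1 − S1, and the steady values quoted below) it is the standard two-state open/closed master equation, which we restate here as a reconstruction, with b the opening (activation) rate and a the closing rate as above:

\[
\frac{dS_1}{dt} = b\,(1 - S_1) - a\,S_1, \qquad S_2 = 1 - S_1,
\qquad S_1^{\text{steady}} = \frac{b}{a+b}, \quad S_2^{\text{steady}} = \frac{a}{a+b}.
\]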

Figure 15: Electrical image of the channel input current

For this system we have the input voltage RI given by the function

Figure 16: Input voltages by the implicit Boolean function (activation function) as the steady value of S1

The input voltages are equal to the implicit Boolean function; they are the steady value S1. For S2 we have the function

Figure 17: Implicit Boolean function for S2 = 1 − S1

In conclusion we can see that the Boolean functions x and NOT x are translated into the two implicit Boolean functions (activation functions) shown in figures 16 and 17. The digital Boolean function, whose values are 1 and 0, is the constraint under which we build the two continuous functions that define the parameters of the differential equation.
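A short numerical sketch of this translation, assuming illustrative voltage-dependent rates b(V) = e^V (opening) and a(V) = e^(−V) (closing); any monotone pair with the same limits would serve:

```python
import numpy as np

def b(V): return np.exp(V)        # opening (activation) rate, illustrative
def a(V): return np.exp(-V)       # closing rate, illustrative

V = np.linspace(-6.0, 6.0, 13)
S1 = b(V) / (a(V) + b(V))         # steady state: rises from 0 to 1, the role of x
S2 = 1.0 - S1                     # falls from 1 to 0, the role of NOT x
print(np.round(S1, 3))
print(np.round(S2, 3))
```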

In a graphic way we have

Figure 18: Input current and RC = T parameters as functions of p = V.

For the three states of the channels we have the system

Figure 19a: Three-state example of complex channel interaction

We have the master equation

We have the graphic image of the implication or dependent Boolean function in (17).
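To make the master equation concrete, here is a minimal sketch for a three-state chain C1 ↔ C2 ↔ O of the kind shown in figure 19a; the rate values below are assumed for illustration, since the paper's rates appear only in the figure.

```python
import numpy as np

# Master equation dP/dt = Q^T P for a three-state channel chain
# C1 <-> C2 <-> O, with assumed example rates (per ms).
a1, b1 = 2.0, 1.0   # C1 -> C2 and back
a2, b2 = 3.0, 0.5   # C2 -> O  and back
Q = np.array([[-a1,       a1,      0.0],
              [ b1, -(b1+a2),      a2 ],
              [0.0,       b2,     -b2 ]])

# Steady state: left null vector of Q (P Q = 0), normalized so that
# the occupancy probabilities sum to 1.
w, v = np.linalg.eig(Q.T)
P = np.real(v[:, np.argmin(np.abs(w))])
P = P / P.sum()
print("steady-state occupancies:", np.round(P, 3))   # ~ [0.067, 0.133, 0.800]
```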

Intention inside the neuron and logic [12]

To introduce intention inside the neuron and in the neural network, we must associate with the intention a logic proposition, in classical and in many-valued logic, as an activation function. In this way the neuron is constrained by the intention, so that the intention in conceptual space is realised by a physical device. So, for example, given the two Boolean functions

As shown in figures 16 and 17, they can be represented by the differential equation of the neuron in figure 14. The intention to obtain the two Boolean functions is realised in the physical world by the system in figure 14.

Figure 19b: The connection between the input and the soma of the neuron is given by a weighted element, denoted the synapse, shown in this figure.

Conclusion

We know that the formal neuron of McCulloch and Pitts [12] was a very important step toward understanding the logic of the brain. To solve the problems of the formal neuron we come back to the natural neuron, where the channel dynamics controls the neuron's behavior. With this revised study of the natural neural network we model the activation state and the activation time of the open and closed channel process. The steady state of the neuron and of the neural network is defined a priori so as to model intention, represented by logic processes such as AND, OR, IF and other Boolean functions. Each Boolean function is a set of spikes, or truths, that we can represent by a continuous function whose maximum and minimum values agree with the logic function: the maximum value is one, or true, and the minimum value is zero, or false. When the discrete or digital Boolean function is transformed in this way into a continuous function of time and space, we can use the natural neural differential equation whose steady state is the intention, the continuous function described before. The continuous dependence function created in the soma can be codified into a set of spikes, so that the axon transmits the message to other neurons, allowing further aggregations that build more and more complex functions for more complex intentions to be implemented in the physical world. We begin with a very simple channel master equation and then move to a complex graph of mutual channel interactions, giving a stronger instrument with which to obtain a physical image of the original conceptual intention.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

Conflict of Interest

The author does not have any conflict of interest.

References

  1. Yang Chunyan, Cai Wen. Extenics: Theory, Method and Application. Beijing: Science Press, 2013 & Columbus: The Educational Publisher, 2013.
  2. Cai Wen. Extension theory and its application. Chinese Science Bulletin, 1999, 44(17): 1538-1548.
  3. Cai Wen, Yang Chunyan, Lin Weichu. Extension Engineering Methods. Beijing: Science Press, 2003.
  4. Yang Chunyan. Overview of extension innovation methods. Communications in Cybernetics, Systems Science and Engineering, Extenics and Innovation Methods. London: CRC Press/Balkema, Taylor & Francis Group, 2013, pp. 11-19.
  5. Yang Chunyan, Li Zhiming. Recent research progress on extension data mining methods. Information Technology Applications in Industry, Applied Mechanics and Materials, vols. 263-266, 2013, pp. 303-311.
  6. Yang Chunyan, Cai Wen. Knowledge representations based on extension rules. WCICA 2008 DVD Proceedings, Chongqing, June 25-27, 2008: 1455-1459.
  7. Li Weihua, Yang Chunyan. Extension information-knowledge-strategy system for semantic interoperability. Journal of Computers, 2008, 3(8): 32-39.
  8. Li Weihua, Yang Chunyan. Eliminate semantic conflicts in Web mining with extension methods. Proceedings of the 2007 International Conference on Systems, Man and Cybernetics, Canada, 2007, 10: 1183-1187.
  9. Germano Resconi, Xiaolin Xu, Guanglin Xu. Introduction to Morphogenetic Computing. Studies in Computational Intelligence, vol. 703, Springer, 2017.
  10. Alain Destexhe, John R. Huguenard. Computational Modeling Methods for Neuroscientists. MIT Press Scholarship Online, August 2013, chapter 5.
  11. B. Widrow, M. A. Lehr. 30 years of adaptive neural networks: Perceptron, Madaline, and backpropagation. Proceedings of the IEEE, 1990, 78(9): 1415-1442.
  12. Warren S. McCulloch, Walter H. Pitts. A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, vol. 5, 1943, pp. 115-133.
  13. A. L. Hodgkin, A. F. Huxley. Propagation of electrical signals along giant nerve fibres. Proc. R. Soc. Lond. B, vol. 140, no. 899, 16 October 1952, pp. 177-183.
  14. G. Resconi. One Step Projection Method in Neural Network. June 2016, DOI 10.13140/RG 2.12949.5927, ResearchGate.
  15. G. Resconi, Chunyan Yang. Solution of brain contradiction by extension theory. IFIP TC 12 ICIS 2018, pp. 24-29.
  16. G. Resconi, Xiaolin Xu, Guanglin Xu. Introduction to Morphogenetic Computing. Springer, 19 May 2017.

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.