4.1. Space complexity
For the WSN–SOM design, a generic mote with an embedded neuron stores a small number of data structures and parameters locally: the neuron weight vector of reals with dimension NIL (the number of features in the training patterns), a second vector of reals with dimension at most NOL (the number of output-layer neurons in the SOM) for storing the outputs received from connected neurons, and the set of parameters, CParams, associated with initialization, training, and the computational model of neuron dynamics. In comparison, the single supervisory mote stores the entire set of P training patterns, each of dimensionality NIL, at a memory cost of O(P×NIL). Wireless communication channels serve as the connections between a transmitting neuron and its receiving neurons for exchanging neuron outputs. In quantitative terms, the memory cost for a generic mote is O(NIL+NOL+|CParams|). Since the dimension of the feature space is the dominant term among those contributing to the space complexity, the memory cost for a generic mote is effectively linear in the number of features of the training patterns for the WSN–SOM neural network. The memory cost for the supervisory mote, by contrast, is on the order of the number of training patterns multiplied by the dimensionality of the patterns in the data set.
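The per-mote storage described above can be sketched as follows. This is an illustrative outline only, assuming the O(NIL+NOL+|CParams|) and O(P×NIL) costs stated in the text; the class and function names (GenericMote, SupervisoryMote, and the two memory helpers) are hypothetical and do not appear in the paper.

```python
from dataclasses import dataclass

@dataclass
class GenericMote:
    """Sketch of the local storage of a generic mote hosting one neuron."""
    weights: list           # weight vector of reals, length NIL -> O(NIL)
    neighbor_outputs: list  # outputs from connected neurons, length <= NOL -> O(NOL)
    cparams: dict           # initialization/training/dynamics parameters -> O(|CParams|)

@dataclass
class SupervisoryMote:
    """Sketch of the supervisory mote, which holds the full training set."""
    patterns: list          # P training patterns, each of length NIL -> O(P * NIL)

def generic_mote_memory(nil, nol, n_params):
    """Asymptotic memory cost of a generic mote: O(NIL + NOL + |CParams|)."""
    return nil + nol + n_params

def supervisory_mote_memory(p, nil):
    """Asymptotic memory cost of the supervisory mote: O(P * NIL)."""
    return p * nil
```

For instance, with NIL = 10 features, NOL = 5 output neurons, and 3 control parameters, a generic mote stores on the order of 18 values, while a supervisory mote holding P = 100 such patterns stores on the order of 1000.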
4.2. Time complexity
The time complexity of the proposed computing system is determined by a number of factors related to the WSN design parameters. In addition, the SOM algorithm has two distinct phases: training, which bears a substantial time cost, and deployment, whose time cost is negligible compared to that of training.
Considering the pseudocode for SOM training in Fig. 11, a set of timing parameters is defined in Table 6 to facilitate the time complexity analysis. Compared to a centralized (non-distributed) implementation, the time complexity of the distributed WSN–SOM implementation differs in three major respects: it reduces the processing time needed to update the network's neuron dynamics and weights through parallelism; it incurs additional time cost due to the communication required among neurons; and it has comparable time cost for the remaining phases, such as determining the best matching unit (BMU).
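Since the Fig. 11 pseudocode is not reproduced here, the two steps referred to above can be illustrated with a standard SOM training iteration: BMU selection by minimum Euclidean distance, followed by a neighborhood-weighted update of every neuron's weight vector. The function names, the learning-rate and neighborhood arguments, and the update rule shown are assumptions based on the conventional SOM algorithm, not on the paper's specific design; in the WSN–SOM, the per-neuron update in the loop is what each mote would execute in parallel.

```python
import math

def best_matching_unit(weights, x):
    """Return the index of the neuron whose weight vector is closest
    to input pattern x (standard SOM BMU rule)."""
    def dist(w):
        return math.sqrt(sum((wi - xi) ** 2 for wi, xi in zip(w, x)))
    return min(range(len(weights)), key=lambda j: dist(weights[j]))

def train_step(weights, x, bmu, lr, neighborhood):
    """One training iteration: move each neuron toward x, scaled by the
    learning rate lr and the neighborhood function neighborhood(bmu, j).
    In the distributed design, each mote performs its own row of this
    update concurrently, which is the source of the parallelism gain."""
    for j, w in enumerate(weights):
        h = neighborhood(bmu, j)
        weights[j] = [wi + lr * h * (xi - wi) for wi, xi in zip(w, x)]
    return weights
```

A minimal usage example: with two neurons at [0, 0] and [1, 1] and input [0.9, 0.9], the second neuron is the BMU, and a hard neighborhood (1 for the BMU, 0 otherwise) moves only that neuron toward the input.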