Tech Networks

 
Network Basics
An Artificial Neural Network
Thursday, November 20, 2008
An artificial neural network (ANN), or simply "neural network" (NN), is a mathematical or computational model based on biological neural networks. It consists of an interconnected group of artificial neurons and processes information using a connectionist approach to computation. In most cases an ANN is an adaptive system that changes its structure based on external or internal information that flows through the network during the learning phase.
In more practical terms, neural networks are non-linear statistical data modeling tools. They can be used to model complex relationships between inputs and outputs or to find patterns in data.
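As a concrete illustration of a non-linear input-output relationship, here is a short Python sketch (a hypothetical example, not taken from the post; all weights are hand-chosen): a tiny two-layer network computes XOR, something no single linear model can represent.

import numpy as np

# Hypothetical hand-chosen weights: a two-layer network that computes XOR,
# a non-linear relationship a single linear unit cannot capture.
def step(z):
    return (z > 0).astype(float)          # simple threshold activation

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

W_hidden = np.array([[1.0, 1.0],          # hidden unit 1 ~ OR of the inputs
                     [1.0, 1.0]])         # hidden unit 2 ~ AND of the inputs
b_hidden = np.array([-0.5, -1.5])

w_out = np.array([1.0, -1.0])             # output ~ OR minus AND, i.e. XOR
b_out = -0.5

h = step(X @ W_hidden.T + b_hidden)
y = step(h @ w_out + b_out)
print(y)                                  # [0. 1. 1. 0.]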



The network in artificial neural network

The word network in the term 'artificial neural network' arises because the function f(x) is defined as a composition of other functions gi(x), which can in turn be defined as compositions of still other functions. This can be conveniently represented as a network structure, with arrows depicting the dependencies between variables. A widely used type of composition is the nonlinear weighted sum, f(x) = K( Σi wi gi(x) ), where K is some predefined function, such as the hyperbolic tangent. It will be convenient in what follows to refer to the collection of functions gi simply as a vector g = (g1, g2, …, gn).
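A minimal Python sketch of this composition, assuming K is the hyperbolic tangent and using made-up weights and component functions gi (none of these values come from the post):

import numpy as np

# f(x) = K( sum_i w_i * g_i(x) ), with K chosen here as tanh.
def g1(x):
    return x            # each g_i could itself be a composition of functions

def g2(x):
    return x ** 2

def f(x, w=(0.8, -0.3)):
    K = np.tanh         # the predefined function K
    return K(w[0] * g1(x) + w[1] * g2(x))

print(f(0.5))           # evaluate the composed function at x = 0.5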

[Figure: ANN dependency graph]


The figure depicts such a decomposition of f, with dependencies between variables indicated by arrows. These can be interpreted in two ways.
The first view is the functional view: the input x is transformed into a 3-dimensional vector h, which is then transformed into a 2-dimensional vector g, which is finally transformed into f. This view is most commonly encountered in the context of optimization.
The second view is the probabilistic view: the random variable F = f(G) depends upon the random variable G = g(H), which depends upon H = h(X), which depends upon the random variable X. This view is most commonly encountered in the context of graphical models.
The two views are largely equivalent. In either case, for this particular network architecture, the components of individual layers are independent of each other (e.g., the components of g are independent of each other given their input h). This naturally enables a degree of parallelism in the implementation.
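The functional view can be written out directly. The sketch below (illustrative random weights; the input is taken to be 4-dimensional as an assumption) passes an input x through a 3-dimensional layer h and a 2-dimensional layer g to produce the scalar f, mirroring the dependency graph above.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative weights; dimensions chosen to match the figure's description.
W1, b1 = rng.standard_normal((3, 4)), rng.standard_normal(3)
W2, b2 = rng.standard_normal((2, 3)), rng.standard_normal(2)
w3, b3 = rng.standard_normal(2), rng.standard_normal()

def forward(x):
    h = np.tanh(W1 @ x + b1)      # 3-dimensional vector h
    g = np.tanh(W2 @ h + b2)      # 2-dimensional vector g; components depend only on h
    return np.tanh(w3 @ g + b3)   # scalar f

x = rng.standard_normal(4)
print(forward(x))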
[Figure: Recurrent ANN dependency graph]

Networks such as the one above are commonly called feedforward, because their graph is a directed acyclic graph. Networks with cycles are commonly called recurrent. Such networks are commonly depicted in the manner shown at the top of the figure, where f is shown as being dependent upon itself. However, there is an implied temporal dependence which is not shown: in practice, the value of f at some point in time t depends upon the values of f at zero, one, or more earlier points in time. The graphical model at the bottom of the figure illustrates the simplest such case, in which the value of f at time t depends only upon its previous value.
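A minimal sketch of that temporal dependence (weights and inputs are illustrative assumptions, not taken from the post): the value of f at step t is computed from the current input and from f at step t-1 only.

import numpy as np

w_in, w_rec = 0.7, 0.5            # input weight and recurrent weight (assumed)
inputs = [0.1, -0.4, 0.9, 0.3]    # x_1 ... x_4 (made-up input sequence)

f_prev = 0.0                      # initial state f_0
for t, x_t in enumerate(inputs, start=1):
    f_t = np.tanh(w_in * x_t + w_rec * f_prev)
    print(f"f({t}) = {f_t:.4f}")
    f_prev = f_t                  # the next step sees only the last value of f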
posted by Nagraj Mudaliar @ November 20, 2008   0 comments
Wi-Fi
Wednesday, November 19, 2008
Wi-Fi is the trade name for the popular wireless technology used in home networks, mobile phones, video games, and other electronic devices that require some form of wireless networking capability. In particular, it covers the various IEEE 802.11 technologies.

The purpose of Wi-Fi is to provide wireless access to digital content. This content may include applications, audio and visual media, Internet connectivity, or other data. Wi-Fi generally makes access to information easier, as it can eliminate some of the physical restraints of wiring; this can be especially true for mobile devices.


Wi-Fi allows local area networks (LANs) to be deployed without cabling for client devices, typically reducing the costs of network deployment and expansion. Spaces where cables cannot be run, such as outdoor areas and historical buildings, can host wireless LANs.

Wireless network adapters are now built into most laptops. The price of chipsets for Wi-Fi continues to drop, making it an economical networking option included in even more devices. Wi-Fi has become widespread in corporate infrastructures.

Different competing brands of access points and client network interfaces are interoperable at a basic level of service. Products designated as "Wi-Fi Certified" by the Wi-Fi Alliance are backwards compatible. Wi-Fi is a global set of standards: unlike mobile telephones, any standard Wi-Fi device will work anywhere in the world.
posted by Nagraj Mudaliar @ November 19, 2008   0 comments