Tag Archives: #Neural Networks

Share Our TOPs

International Programmers’ Day was last week; today we give a special shout-out to those programmers who developed the apps that allow companies to communicate easily and provide services to clients from around the world.

NTR Lab is one of those companies; we work remotely, using multiple technologies to communicate and collaborate with companies in dozens of countries, and we also meet them in person on road trips organized specifically for that purpose.

Today we would like to tell you more about the life of our software development company and share the most popular and interesting things that have happened recently. We have put together a list post of our TOPs.


Why Do Neural Networks Need An Activation Function?

by the Computer Vision Department of NTRLab

Suppose we are given a set of distinct points $P = \{(x_i, y_i) \in \mathbb{R}^m \times \mathbb{R}\}_{i=1,\dots,n}$, which we regard as a set of test samples $x_i \in \mathbb{R}^m$ with known answers $y_i \in \mathbb{R}$. To avoid non-compactness we may assume that $P$ lies in some compact set $K$; for example, $K$ may be a polytope. Does there exist a function in $C(K)$, the space of all continuous functions on $K$, whose graph is a good approximation of our set $P$ in some sense?
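One natural way to make "a good approximation in some sense" precise, offered here only as an illustration and not as the metric the post necessarily adopts, is to ask that the graph of $F$ pass within a prescribed tolerance of every sample, i.e. to bound the uniform error over $P$:

$$\text{for a given } \varepsilon > 0, \text{ find } F \in C(K) \text{ such that } \max_{1 \le i \le n} \bigl|F(x_i) - y_i\bigr| < \varepsilon .$$

Other choices, such as the least-squares error $\sum_{i=1}^{n} (F(x_i) - y_i)^2$, lead to the same kind of question.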

From the approximation theory point of view, a neural network is a family of functions $\{F_\theta,\ \theta \in \Theta\}$ from some functional class. Each particular neural network defines its own family of functions, and some of these families may be equivalent in some sense. If, for the problem above, we restrict ourselves to an MLP with a single hidden layer of $N$ units, then the corresponding family of functions is

$$F_\theta(x) \;=\; \sum_{k=1}^{N} c_k\, \sigma\bigl(\langle w_k, x \rangle + b_k\bigr), \qquad \theta = \{(c_k, w_k, b_k)\}_{k=1}^{N}, \quad w_k \in \mathbb{R}^m,\ c_k, b_k \in \mathbb{R},$$

where $\sigma \colon \mathbb{R} \to \mathbb{R}$ is the activation function.
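To make the role of $\sigma$ concrete, here is a minimal NumPy sketch of this family (the name mlp_family and the parameter shapes are our own illustrative choices, not part of the post): with a nonlinear $\sigma$ the family is genuinely nonlinear, while with the identity "activation" it collapses to a single affine map, which previews why an activation function is needed at all.

import numpy as np

def mlp_family(x, W, b, c, sigma=np.tanh):
    """One-hidden-layer MLP: F_theta(x) = sum_k c_k * sigma(<w_k, x> + b_k).

    x : (m,) input point          W : (N, m) hidden-layer weights w_k
    b : (N,) hidden-layer biases  c : (N,) output weights c_k
    """
    return c @ sigma(W @ x + b)

# Illustration: with sigma = identity the whole family degenerates into
# the affine map x -> (c @ W) x + c @ b, so no choice of theta can fit
# nonlinear data -- the nonlinearity comes entirely from sigma.
rng = np.random.default_rng(0)
m, N = 3, 16
W, b, c = rng.normal(size=(N, m)), rng.normal(size=N), rng.normal(size=N)
x = rng.normal(size=m)
identity = lambda z: z
assert np.isclose(mlp_family(x, W, b, c, identity), (c @ W) @ x + c @ b)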
