Tag Archives: #Computer Vision

From AI (thinking) to AEI (feeling)

“Emotional intelligence (EI) is the capability of individuals to recognize their own emotions and those of others, discern between different feelings and label them appropriately, use emotional information to guide thinking and behavior, and manage and/or adjust emotions to adapt to environments or achieve one’s goal(s).”

— Colman, Andrew (2008). A Dictionary of Psychology (3 ed.). Oxford University Press. (source: Wikipedia)

Major strides have been made in creating Artificial Intelligence (AI). Machines embedded with AI can do a lot, but they lack what is inherent in people: Emotional Intelligence.

Emotional intelligence allows us to recognize our own emotions and those of others, and to use this information to guide our thinking and behavior. Moving from AI to AEI is a work in progress.

Image: a talking robot (source: KQED)

Continue reading From AI (thinking) to AEI (feeling)

(VIDEO) NTR LAB Rocks IT In Siberia

Want to rock your startup, but lack IT expertise?

Shortage of qualified developers? No time to build the right team? Have an idea, but aren't technical? Does your startup need been-there/done-that advice? Speed is everything: to develop fast, you need new developers who can both start and scale.

We are NTR Lab — a custom software development company. We can help with all the questions above. We offer solutions. We develop our clients’ ideas using our existing teams.

NTR Lab team

Continue reading (VIDEO) NTR LAB Rocks IT In Siberia

Why Do Neural Networks Need An Activation Function?

by Computer Vision Department of NTRLab 

Suppose we are given a set of distinct points P = {(xi, yi) ∈ ℝm × ℝ}i=1,…,n, which we regard as a set of test samples xi ∈ ℝm with known answers yi ∈ ℝ. To avoid non-compactness, we may assume that P lies in some compact set K; for example, K may be a polytope. Does there exist a continuous function in C(K), the space of all continuous functions on K, whose graph is a good approximation of our set P in some sense?

From the approximation-theory point of view, a neural network is a family of functions {Fθ, θ ∈ Θ} from some functional class. Each particular neural network defines its own family of functions, and some of these families may be equivalent in some sense. If, in the setting of the above problem, we restrict ourselves to an MLP with a single hidden layer of N units, then the corresponding family of functions is

Fθ(x) = ∑k=1,…,N ck σ(⟨wk, x⟩ + bk),  θ = (wk, bk, ck)k=1,…,N, wk ∈ ℝm, bk, ck ∈ ℝ,

where σ is the activation function.
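As a minimal sketch of this family, the snippet below evaluates Fθ(x) = ∑k ck σ(⟨wk, x⟩ + bk) with a sigmoid activation and fits only the outer coefficients ck by least squares over random inner parameters (wk, bk). The fitting procedure and the target function y = x² are illustrative assumptions, not the article's method; the point is that the family above can approximate a set of samples (xi, yi).

```python
import numpy as np

def sigma(t):
    """Sigmoid activation, one common choice for the sigma in F_theta."""
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)

# Test samples x_i in R^m with known answers y_i (here m = 1, y = x^2 on [-1, 1]).
n, m, N = 50, 1, 30
x = rng.uniform(-1.0, 1.0, size=(n, m))
y = (x ** 2).ravel()

# Random inner parameters w_k, b_k; hidden-layer outputs sigma(<w_k, x> + b_k).
w = rng.normal(size=(N, m))
b = rng.normal(size=N)
H = sigma(x @ w.T + b)           # shape (n, N)

# Fit only the outer coefficients c_k by least squares.
c, *_ = np.linalg.lstsq(H, y, rcond=None)

F = H @ c                        # F_theta evaluated at the samples x_i
err = np.max(np.abs(F - y))
print(f"max |F_theta(x_i) - y_i| = {err:.4f}")
```

Even with frozen random inner weights, N = 30 sigmoid units are enough to track a smooth target on these samples; replacing σ with the identity would collapse Fθ to an affine function, which is one way to see why the activation is needed.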

Continue reading Why Do Neural Networks Need An Activation Function?