
Main article: Emotions


YC alum Modern Health, a startup focused on emotional wellbeing, gets $2.26M seed funding

17:00 | 15 June

Modern Health founders Alyson Friedensohn and Erica Johnson

About one year ago, a note from a CEO thanking his employee for using sick days to take care of her mental health went viral. It was a reminder to Alyson Friedensohn of what she wants to accomplish with Modern Health, the emotional health benefits startup she founded last year with neuroscientist Erica Johnson.

“We want that to be normal. We want the email she sent to be normal, to be able to be that open,” Friedensohn tells TechCrunch.

Modern Health, a Y Combinator alum, announced today that it has raised $2.26 million in seed funding for hiring, accelerating the development of its healthcare platform and growing its network of therapists, coaches and other providers. Offered as a benefit by companies, Modern Health’s services are meant to improve employee well-being and retention rates. The round was led by Afore, with participation from Social Capital, Precursor Ventures, Merus Capital, Maschmeyer Group Ventures, Y Combinator and angel investors.

Friedensohn, Modern Health’s chief executive officer, says six employers currently offer its platform, which includes services like counseling and career and financial coaching. One thing the startup is especially proud of is the fact that Modern Health’s team is currently all female and Friedensohn wants to parlay their point of view into services that address issues affecting women. For example, the platform already works with providers who specialize in postpartum depression and infertility.

“People don’t talk about what working moms are dealing with and countless things like that,” says Friedensohn, who previously worked at health tech companies Keas and Collective Health. “People don’t want to talk about it because they are worried it will jeopardize their careers, but it makes a difference.”

Several other tech startups are working on mental health care platforms for employers to offer as a benefit, including Ginger.io, Lyra Health and Quartet, which have all received significant amounts of funding from prominent investors. The space is especially important, given the alarming rise in the United States’ suicide rate and the fact that about 6.7% of all adults in the U.S. have experienced at least one major depressive episode.

One of Modern Health’s priorities is to reach employees before they hit a crisis point. Since many people are daunted by the idea of therapy, the platform connects them to coaches instead to focus on specific issues, like their careers, or overall emotional wellbeing. This helps referrals, Friedensohn notes, because it makes the service feel more approachable.

“They can say to friends, I have this awesome Modern Health coach, versus saying I have a therapist, so it’s way easier for people to engage,” she says.

Modern Health also makes its services more accessible by offering several ways to use the platform: texting, video calls or, for people who don’t want to talk to a therapist or coach yet, meditation apps and other digital tools created by the company. One of Modern Health’s newest customers, human resources startup Gusto, hit a 43% utilization rate of its services, including connecting employees to coaches and therapists, among registered users just four days after it began offering the platform. Friedensohn adds that it’s not uncommon for people to write essays on their sign-up forms when registering because it’s the first time they’ve been able to unload their problems.

“People like that it’s coaching,” she says. “What we found is that by focusing on that point, the biggest thing is lowering the barrier to entry, so that people who are depressed are also comfortable reaching out.”



This jolly little robot gets goosebumps

01:30 | 17 May

Cornell researchers have made a little robot that can express its emotions through touch, sending out little spikes when it’s scared or even getting goosebumps to express delight or excitement. The prototype, a cute smiling creature with rubber skin, is designed to test touch as an I/O system for robotic projects.

The robot mimics the skin of an octopus, which can turn spiky when threatened.

The researchers, Yuhan Hu, Zhengnan Zhao, Abheek Vimal, and Guy Hoffman, created the robot to experiment with new methods for robot interaction. They compare the skin to “human goosebumps, cats’ neck fur raising, dogs’ back hair, the needles of a porcupine, spiking of a blowfish, or a bird’s ruffled feathers.”

“Research in human-robot interaction shows that a robot’s ability to use nonverbal behavior to communicate affects their potential to be useful to people, and can also have psychological effects. Other reasons include that having a robot use nonverbal behaviors can help make it be perceived as more familiar and less machine-like,” the researchers told IEEE Spectrum.

The skin has multiple configurations and is powered by a computer-controlled elastomer that can inflate and deflate on demand. The goosebumps pop up to match the expression on the robot’s face, allowing humans to better understand what the robot “means” when it raises its little hackles or gets bumpy. I, for one, welcome our bumpy robotic overlords.
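As a rough illustration of the control idea described above — choosing a skin texture to match the robot’s displayed emotion — here is a minimal Python sketch. The emotion names, pattern labels and inflation values are hypothetical, not taken from the Cornell prototype’s actual control code:

```python
# Hypothetical mapping from an emotional state to a texture command for a
# pneumatic elastomer skin. Values are illustrative only.
TEXTURE_MAP = {
    "scared":    {"pattern": "spikes",     "inflation": 1.0},
    "excited":   {"pattern": "goosebumps", "inflation": 0.6},
    "delighted": {"pattern": "goosebumps", "inflation": 0.4},
    "calm":      {"pattern": "flat",       "inflation": 0.0},
}

def skin_command(emotion):
    """Return the texture/inflation command matching the robot's expression,
    defaulting to a flat skin for unrecognized states."""
    return TEXTURE_MAP.get(emotion, TEXTURE_MAP["calm"])

print(skin_command("scared"))  # {'pattern': 'spikes', 'inflation': 1.0}
```

The point of the lookup-table design is the one the researchers describe: the skin state is driven by the same emotion signal as the face, so the two channels never contradict each other.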



This eQuoo app games you into learning useful psychological skills

19:59 | 9 March

Mental health is one of the biggest issues of recent times, with Kendall Jenner, Emma Stone, Lady Gaga and even The Rock opening up about their mental health issues. Even the royal family has got in on the act, setting up the Heads Together charity. And it’s not just a fleeting issue. The World Health Organisation says depression will overtake cancer as the world’s main ‘global disease burden’ by 2030.

So it’s not surprising that meditation apps like Calm are going gangbusters amongst consumers. However, meditation and brain training apps are targeted towards cognitive skills. These skills can help with concentration and dementia, for example, but won’t allow you to have better and deeper relationships.

So it’s interesting that a new app has launched to tackle the thornier problem of how to improve your emotional intelligence. That’s the hope of new app startup eQuoo now launched on the Apple App store and Google Play.

eQuoo is an emotional fitness game that aims to teach you psychological skills in a fun and engaging way. You get to use those skills in a choose-your-own-adventure game, but they are for real-life situations as well.

Founder Silja Litvin (a psychologist) says she was inspired to create the game because “emotional intelligence is more important than IQ when it comes to success in the workplace. If you know how and why you react and feel the way you do, you can navigate through stressful situations and relationships much better.”

Equoo has raised an Angel round, and the investors include Julian Pittam (Investor to Disciple Media, which just raised $4 million), Pierre Andurand and other angels.

As a child, Litvin moved from sunny Southern California to less-than-sunny Luxembourg and was bullied at school. But things changed when her older sister got a body-language book for her birthday. “The idea that there was a science – a manual, so to speak – about why people did what they did completely blew my 12-year-old mind. That’s when I decided to become a psychologist: I wanted to spread the good news. eQuoo essentially helps you build up people skills.” Her team also includes Med Buckey (CTO), Professor Markus Maier (Head of research) and James O’Brien (Lead developer).

In the game, a character, Dr Joy (named after Sigmund Freud – Freud is joy in German), walks you through the learning session. eQuoo lets you practice the skills in the learning part of the game; unlock a level that leads to a choose-your-own-adventure story where you need to use the skills to win; get feedback on your personality through the Big Five personality test; share your personality feedback on social media; and learn more about the skills you possess on a deeper level. Anything you learn becomes part of your psychological ‘tool-box’.

The startup opened its beta in Australia and New Zealand and has a weekly growth rate of 19% and a 4.8 rating in the app store. Phase one will focus on advertising and subscriptions, but after that, it plans to work with insurance companies as a mental health game aimed at preventing anxiety and depression.



Evolve Foundation launches a $100 million fund to find startups working to relieve human suffering

01:58 | 4 November

It seems there’s nothing but bad news out there lately, but here’s some good news — the nonprofit Evolve Foundation has raised $100 million for a new fund called the Conscious Accelerator to combat loneliness, purposelessness, fear and anger spreading throughout the world through technology.

Bo Shao, co-founder of Matrix Partners China, will lead the fund and will be looking for entrepreneurs focusing on tech that can help people become more present and aware.

“I know a lot of very wealthy people and many are very anxious or depressed,” he told TechCrunch. A lot of this he attributes to the way we use technology, especially social media networks.

“It becomes this anxiety-inducing activity where I have to think about what’s the next post I should write to get most people to like me and comment and forward,” he said. “It seems that post has you trapped. Within 10 minutes, you are wondering how many people liked this, how many commented. It was so addicting.”

Teens are especially prone to this anxiety, he points out. It turns out it’s a real mental health condition known as Social Media Anxiety Disorder (SMAD).

“Social media is the new sugar or the new smoking of this generation,” Shao told TechCrunch.

He quit social media in September of 2013 but tells TechCrunch he’s been on a journey to find ways to improve his life and others for the last 10 years.

His new fund, as laid out in a recent Medium post announcement, seeks to maximize social good by finding technological solutions to the issues now facing us, not just investing in something with good returns.

Shao plans to use his background as a prominent VC in a multi-billion-dollar firm to find those working on the type of technology to make us less anxious and more centered.

The Conscious Accelerator has already funded a meditation app called Inside Timer. It’s also going to launch a parenting app to help parents raise their children to be resilient in an often confusing world.

He’s also not opposed to funding projects like the one two UC Berkeley students put together to identify Russian and politically toxic Twitter bots — something Twitter has been criticized for not getting a handle on internally.

“The hope is we will attract entrepreneurs who are conscious,” Shao said.

Featured Image: Yann Cœuru/Flickr UNDER A CC BY 2.0 LICENSE



New Affectiva cloud API helps machines understand emotions in human speech

16:00 | 13 September

Affectiva, the startup that spun out of the MIT Media Lab several years ago with tools designed to understand facial emotions, announced a new cloud API today that can detect a range of emotion in human speech.

When we speak, our voices offer subtle and not so subtle cues about our emotions. Whether our voices are tight or loud or soft can give valuable clues about our feelings. Humans can sometimes (although not always) detect those emotions, but traditionally computers have not been very good at it.

Alexa isn’t terribly funny because the technology doesn’t understand humor or tone, and can’t understand when you’re joking versus asking a genuine question. Using Affectiva’s new tech, voice assistants, bots and other devices that operate using artificial intelligence might soon be able to hear and understand our emotions — and be able to derive more meaning from our requests, company CEO and co-founder Dr. Rana el Kaliouby told TechCrunch.

“Amazon [and other companies] knows if it wants it to be persuasive to try a product or route, it needs to have a relationship [with you]. To have a relationship, it needs to understand your emotional state, which is what humans do, have a real-time understanding of an emotional state. Are you annoyed, frustrated, confused?” Kaliouby explained.

Amazon isn’t alone. Car makers are interested in knowing your emotional state behind the wheel, and that of your passengers. These factors could have an impact on your safety in the car. Any company could use a better understanding of customers calling into their call centers or dealing with a customer service bot (they would find me often annoyed).

About a year ago, the company decided to begin studying how a machine might be able to detect an emotion based on the quality of the spoken voice. This is no easy task. There are different languages and a variety of cultural cues, which aren’t necessarily standard from country to country and culture to culture.

The company has been collecting data in the public domain and from its own data sets related to the emotional facial recognition research from around the world. They have teams of people listening to each test subject and identifying the emotion. To avoid bias, each labeler goes through a training program, and for each item in the test set, at least three of five testers have to agree on the emotional state, she said.
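The labeling rule described here — at least three of five annotators must agree on the emotional state, otherwise the item is not accepted — is a standard majority-vote aggregation. A minimal Python sketch of that rule, with hypothetical label names:

```python
from collections import Counter

def aggregate_label(annotations, min_agreement=3):
    """Return the majority emotion label if at least `min_agreement`
    annotators agree; otherwise return None (the item is discarded
    or sent back for re-labeling)."""
    label, count = Counter(annotations).most_common(1)[0]
    return label if count >= min_agreement else None

# Five annotators rate one voice clip:
print(aggregate_label(["angry", "angry", "angry", "neutral", "sad"]))  # angry
print(aggregate_label(["angry", "sad", "neutral", "happy", "angry"]))  # None
```

Requiring a supermajority like this trades dataset size for label quality, which matters for subjective targets such as perceived emotion.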

Affectiva understands that the data it has gathered to this point is only the beginning. Today’s announcement around the API is also about getting partners to help push this work further along. “We are starting with a cloud-based API because we are looking for data partners interested in partnering around data and emotion classifiers,” she said.

All of this research is great in theory, but there are also many ethical questions related to machines detecting our emotions in our faces and our speech, and Kaliouby understands this. They have strict guidelines about gathering information and how they use it.

They are also running a one-day Emotion AI Summit today in Cambridge at the MIT Media Lab where a variety of speakers will discuss the implications for this kind of technology on society.

Featured Image: Flashpop/Getty Images



Curious to crack the neural code? Listen carefully

22:30 | 10 July

Perhaps for the first time ever, our noggins have stolen the spotlight from our hips. The media, popular culture, science and business are paying more attention to the brain than ever. Billionaires want to plug into it.

Scientists view neurodegenerative disease as the next frontier after cancer and diabetes, hence increased efforts in understanding the physiology of the brain. Engineers are building computer programs that leverage “deep learning,” which mimics our limited understanding of the human brain. Students are flocking to universities to learn how to build artificial intelligence for everything from search to finance to robotics to self-driving cars.

Public markets are putting premiums on companies doing machine learning of any kind. Investors are throwing bricks of cash at startups building or leveraging artificial intelligence in any way.

Silicon Valley’s obsession with the brain has just begun. Futurists and engineers are anticipating artificial intelligence surpassing human intelligence. Facebook is talking about connecting to users through their thoughts. Serial entrepreneurs Bryan Johnson and Elon Musk have set out to augment human intelligence with AI by pouring tens of millions of their own money into Kernel and Neuralink, respectively.

Bryan and Elon have surrounded themselves with top scientists, including Ted Berger and Philip Sabes, to start by cracking the neural code. Traditionally, scientists seeking to control machines through our thoughts have done so by interfacing with the brain directly, which means surgically implanting probes into the brain.

Scientists Miguel Nicolelis and John Donoghue were among the first to control machines directly with thoughts through implanted probes, also known as electrode arrays. Starting decades ago, they implanted arrays of approximately 100 electrodes into living monkeys, which would translate individual neuron activity to the movements of a nearby robotic arm.

The monkeys learned to control their thoughts to achieve desired arm movement; namely, to grab treats. These arrays were later implanted in quadriplegics to enable them to control cursors on screens and robotic arms. The original motivation, and funding, for this research was intended for treating neurological diseases such as epilepsy, Parkinson’s and Alzheimer’s.

Curing brain disease requires an intimate understanding of the brain at the brain-cell level.  Controlling machines was a byproduct of the technology being built, in addition to access to patients with severe epilepsy who already have their skulls opened.

Science-fiction writers imagine a future of faster-than-light travel and human batteries, but assume that you need to go through the skull to get to the brain. Today, music is a powerful tool for conveying emotion, expression and catalyzing our imagination. Powerful machine learning tools can predict responses from auditory stimulation, and create rich, Total-Recall-like experiences through our ears.

If the goal is to upgrade the screen and keyboard (human<->machine), and become more telepathic (human<->human), do we really need to know what’s going on with our neurons? Let’s use computer engineering as an analogy: physicists are still discovering elementary particles, but we know how to manipulate charge well enough to add numbers, trigger pixels on a screen, capture images from a camera and capture inputs from a keyboard or touchscreen. Do we really need to understand quarks? I’m sure there are hundreds of analogies where we have pretty good control over things without knowing all the underlying physics, which raises the question: Do we really have to crack the neural code to better interface with our brains?

In fact, I would argue that humans have been pseudo-telepathic since the beginning of civilization, and that has been in the art of expression. Artists communicate emotions and experiences by stimulating our senses, which already have a rich, wide-band connection to our brains. Artists stimulate our brains with photographs, paintings, motion pictures, song, dance and stories. The greatest artists have been the most effective at invoking certain emotions: love, fear, anger and lust, to name a few. Great authors empower readers to live through experiences by turning the page, creating richer experiences than what’s delivered through IMAX 3D.

Creative media is a powerful tool of expression to invoke emotion and share experiences. It is rich, expensive to produce and distribute, and disseminated through screens and speakers. Although virtual reality promises an even more interactive and immersive experience, I envision a future where AI can be trained to invoke powerful, personal experiences through sound.

Unfortunately, those great artists are few, and heavy investment goes into a production that needs to be amortized over many consumers (1->n). Can content be personalized for EVERY viewer as opposed to a large audience (1->1)? For example, can scenes, sounds and smells be customized for individuals? To go a step further: can individuals, in real time, stimulate each other to communicate emotions, ideas and experiences (1<->1)?

We live our lives constantly absorbing from our senses. We experience many fundamental emotions such as desire, fear, love, anger and laughter early in our lifetimes. Furthermore, our lives are becoming more instrumented by cheap sensors and disseminated through social media. Can a neural net trained on our experiences, and our emotional responses to them, create audio stimulation that would invoke a desired sensation? This “sounds” far more practical than encoding that experience in neural code and physically imparting it to our neurons.

Powerful emotions have been inspired by one of the oldest modes of expression: music. Humanity has invented countless instruments to produce many sounds, and, most recently, digitally synthesized sounds that would be difficult or impractical to generate with a physical instrument. We consume music individually through our headphones, and experience energy, calm, melancholy and many other emotions. It is socially acceptable to wear headphones; in fact Apple and Beats by Dre made it fashionable.

If I want to communicate a desired emotion with a counterpart, I can invoke a deep net to generate sounds that, as interpreted by a deep net sitting on my partner’s device, invoke the desired emotion. There are a handful of startups using AI to create music with desired features that fit in a certain genre. How about taking it a step further to create a desired emotion on behalf of the listener?

The same concept can be applied for human-to-machine interfaces, and vice-versa. Touch screens have already almost obviated keyboards for consuming content; voice and NLP will eventually replace keyboards for generating content.

Deep nets will infer what we “mean” from what we “say.” We will “read” through a combination of hearing through a headset and “seeing” on a display. Simple experiences can be communicated through sound, and richer experiences through a combination of sight and sound that invokes our other senses; just as the sight of a stadium invokes the roar of a crowd and the taste of beer and hot dogs.

Behold our future link to each other and our machines: the humble microphone-equipped earbud. Initially, it will tether to our phones for processing, connectivity and charge. Eventually, advanced compute and energy harvesting will obviate the need for phones, and those earpieces will be our bridge to every human being on the planet, and our access point to the corpus of human knowledge and expression.

The timing of this prediction may be ironic given Jawbone’s expected shutdown. I believe that there is a magical company to be built around what Jawbone should have been: an AI-powered telepathic interface between people, the present, past and what’s expected to be the future.

Featured Image: Ky/Flickr UNDER A CC BY 2.0 LICENSE



Accenture, can I ask you a few questions?

21:30 | 12 April

Hey Accenture, are you aware that your PR firm is pitching your latest corporate beta for a creepy face and emotion-monitoring algorithm as a party trick?

Do you know what a party is? Have you read the definition?

Would you call requiring people to download an app before they can get into an event a party-time kind of thing to do? Would you say that demanding that people scan their faces so they can be recognized by an algorithm is fun times?

Does having that same algorithm watch and record every interaction that happens in the Austin bar or event space you’ve filled with some light snacks and a bunch of free liquor count as rockin like Dokken?

Do good hosts require people to become lab rats in the latest attempt to develop HAL?

Is monitoring patients’ faces in hospitals really the best way to apply the technology outside of your PartyBOT’s par-tays? Or is that also a creepy and intrusive use of technology, when other solutions exist to track actual vital signs?

What do your own office parties look like? Does Sam from the front desk have to try out the newest accounting software to get a drink from the punch bowl? Does Diane have to swear allegiance to Watson before grabbing that tuna roll? Do you throw them at Guy’s American Kitchen & Bar?

Does your soul die a little when you turn people into test subjects for the AI apocalypse?

Maybe after reading this press release, it should?

With the rise of AI and voice recognition, customer experiences can be curated to the next level. Imagine Amazon’s Alexa, but with more emotion, depth and distinction. Accenture Interactive’s PartyBOT is not simply a chatbot – it is equipped with the latest technologies in facial recognition that allow the bot to recognize user feelings through facial expressions, and words resulting in more meaningful conversations.

Featured at this year’s SXSW, the PartyBOT delivered an unparalleled party experience for our guests – detecting their preferences from favorite music, beverages and more. The PartyBOT went so far as to check in on every attending guest at the party, curating tailored activities based on their preferences. Link to video:

(pw: awesome)

But the PartyBOT goes much further than facilitating great party experiences. Its machine learning applications can apply to a range of industries from business to healthcare – acting as an agent to support patient recognition and diagnosis in hospitals that can recognize patient distress and seek the appropriate help from doctors.

If you would like to learn more about the PartyBOT, I’m happy to put you in touch with our executives to discuss the applications of our technology and potentially schedule time to see this in our studios.

Featured Image: Rosenfeld Media/Flickr UNDER A CC BY 2.0 LICENSE



Immigrant eyes

18:10 | 19 March

Over the past few years we’ve seen a lot of anger. We’ve seen a seemingly sane country cut itself off from mainland Europe and we’ve seen a seemingly beneficent border police force turn angry. We’ve heard that immigrants steal our jobs, kill our people, and bring in drugs and terror.

This is wrong.

On the pro side, we know that immigrants, as a whole, commit less crime than US-born citizens. On the side against immigration, we get a few vignettes of terror that pretend to paint a whole picture. Further, we must understand the plight of the worker in America and Britain. Immigrants do take jobs, and the white-collar world can’t see the effects. Antonio Garcia-Martinez, author of Chaos Monkeys, said it best when he wrote:

Blue Staters ridicule working-class Red Staters’ support for Trump, and his rhetoric around building a wall. Of course, white collar workers already have their wall: it’s called the H1 visa, and it means they don’t have to compete with every graduate of IIT or Tsinghua for a job. Imagine for a moment if there were a scrum of diligent and capable Chinese and Indian engineers in front of Facebook and Google (as there is a scrum of Mexicans outside every Home Depot in the US), each ready to take the job for less than the coddled American inside stuffing his face with the free ribs. What would that engineer’s opinion on illegal immigration be then?

What the Red Staters are asking for is essentially the same guarantees around a limited flow of outside labor that the Blue Staters already enjoy.

As always, it’s not enlightened ideals that define political opinions, it’s power and self-interest. White collar workers have the views they do because they’ve already gotten theirs, and they don’t care about the high-school-educated plumber or construction worker in Louisville or Des Moines, who sees that Mexican outside Home Depot (rightly) as a threat to his livelihood.

Everyone is right and no one is. High tech is blind to the plight of the migrant worker and depends on the largesse of governments for programmers and outsourced data centers. Everyone, from the guy in a truck at Home Depot to the woman in an Uber on the Golden Gate, must take stock and try to help.

Ultimately immigration is a change agent and a necessity. Populations age. Cultures shift. New technologies will fix old failures. And throughout it all the sane and accepted migration of the skilled and unskilled, refugees and expats, will get us through. To blame our ills on those different from us is a failure of humanity and this failure has repeated itself over and over again since the Dark Ages. We fear – but need – the Other. That fear must be excised.

In the end I can only recommend one or two things to fix this in the short term. First, understand your own roots and your own path and try to help others in your same course. My grandparents were Polish and Hungarian. I’ve tried, in my own way, to boost those countries as much as I can, knowing full well that their economies and ecosystems are in full bloom and they don’t need much help. But what they need is investment and attention. I can give them that.

We can also help spread the vision of entrepreneurship throughout the world. A few weeks ago I met a group of Cuban entrepreneurs who were visiting Denver and stopped in at Boomtown to see what the accelerator experience was like. These were men and women who, despite the odds and political machinations, were building their country’s Internet infrastructure. They loved seeing how the accelerator helped young companies grow, and I think their eyes were opened to new possibilities. They’ll be attending Disrupt in May, as well. It was the least I could do for them.

But the thing that struck me most was how similar they were to every entrepreneur I meet from Zagreb to Alameda. All had a clarity of vision, all were ready to work and change. All were ready to move forward despite all odds. They aren’t here to steal jobs, they’re here to make jobs. They’re not here to break the body politic but to strengthen the good and cleave away the bad. They are here to fix things at home and abroad.

They are not immigrants. They are people. They move from place to place, bringing some bad but more good. They will never stop. They will never give up. It’s better to harness that energy than to stifle it.

I was at a wedding in Pittsburgh a few years ago when I heard a story from an older friend who heard about my Polish roots. Three girls, eight, ten, and fourteen, came over from Warsaw by way of Gdansk in about 1900. They sailed with their father, a blacksmith and drinker, and arrived in New York shaken and sick. They took an overland route to Coal Country and settled outside of Pittsburgh. Their father went to work in the mines and the girls went to school. One afternoon, when they came home, their father was gone.

He had gone back to Poland, leaving the girls alone. They didn’t know why he left – he had a half-baked plan to bring his wife over – and he didn’t leave much money or even a note. The girls took jobs sewing and cleaning, and the oldest took care of the middle child, who took care of the youngest. All went to school when they could. The neighborhood, made up mostly of immigrants, tried to help, and the girls made a life for themselves. Unbeknownst to them, their parents died back in Poland, which is why their letters home went unanswered. It is a tale of penury, perdition, and fear, of babes in the woods and, potentially, failure.

But it didn’t end in a nightmare.

The girls grew and married. They built lives in a new world into which they were thrust, like babies dipped howling into the baptismal font in Pittsburgh’s Kościół Matki Boskiej, the shock of the future coming over them in a rush. And they made it. Their father, for all his faults, knew they would be safe. The world where his daughters now lived was not nearly as dangerous as the world he left and the potential for survival was far greater. He trusted his girls to the tide of immigration and he was not wrong.

“That was my grandmother, the youngest one,” said my friend. He is now an engineer in Pittsburgh, an American.

We forget that we all came from afar, with nothing. And we cannot fear those who follow behind us.

Featured Image: jvoves/Flickr UNDER A CC BY 2.0 LICENSE



The Emotion Journal performs real-time sentiment analysis on your most personal stories

16:06 | 4 December

Andrew Greenstein, an app developer from San Francisco, started journaling a few months ago. He tries to write for five minutes every day, but it’s challenging to set aside the time. Still, he’s read that journaling reduces stress and can help with goal-setting, so he’s trying to make it a habit.

At the Disrupt London Hackathon, Greenstein and his team built The Emotion Journal, a voice journaling app that performs real-time emotional analysis to detect the user’s feelings and chart their emotional state over time. (A small aside here: when you check out The Emotion Journal’s website, you might get a security warning — this is because the site is built to have a secure https connection but the developers haven’t paid for a certificate yet.)

By day, Greenstein is the CEO of SF AppWorks, a digital agency. But he and his co-founder, Darius Zagrean, have become increasingly interested in artificial intelligence and how it can be used to address mental health issues.

“I’d like to continue working on it because this stuff to me, the connection between human and computer, is so fascinating,” Greenstein said.

The idea to apply AI to mental health challenges first came to Greenstein and his team during an internal AppWorks hackathon, during which one team built an app to roleplay anxiety-inducing situations and improve the user’s response to his or her anxiety triggers. “We wanted to get into AI as much as we could. It’s so clear that’s where the world is going. This one had the emotional factor that really grabbed me,” Greenstein explained.

He decided to continue exploring the intersection of AI and emotion at the London Hackathon and built The Emotion Journal using IBM Watson to perform the real-time sentiment analysis. A user talks about their day and The Emotion Journal reacts quickly, changing color in response to the user’s feelings.

The Emotion Journal also stores and color-codes logs over time, so a user can see at a glance what their recent emotions have been. “We talked a lot about how journaling needs to be more indexable, more searchable. Maybe you can learn something from a past session,” Greenstein said.
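The core loop described above, scoring an entry and turning the score into a color, can be sketched roughly. This is a hypothetical illustration, not The Emotion Journal’s actual code: the real app uses IBM Watson for the analysis, which is stubbed out here with a tiny keyword lexicon.

```python
# Hypothetical sketch: map a sentiment score in [-1.0, 1.0] to an RGB color,
# red for negative and green for positive. In the real app the score would
# come from a sentiment API; here it is stubbed with a small word list.

NEGATIVE = {"sad", "angry", "stressed", "tired"}
POSITIVE = {"happy", "calm", "excited", "grateful"}

def sentiment_score(text: str) -> float:
    """Crude lexicon score: (#positive - #negative) / #matched words."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched

def score_to_rgb(score: float) -> tuple:
    """Linearly blend from red (-1.0) through the midpoint to green (+1.0)."""
    t = (score + 1.0) / 2.0           # normalize to [0, 1]
    red = int(255 * (1.0 - t))
    green = int(255 * t)
    return (red, green, 64)           # constant blue channel for a softer hue

entry = "I felt happy and calm after my morning run"
color = score_to_rgb(sentiment_score(entry))
```

Storing one `(date, score, color)` tuple per session would then give the color-coded, searchable log the app charts over time.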

Although The Emotion Journal is a promising merger of AI and journaling that might teach us more about our own mental health, it might not be around forever; Greenstein hopes to keep the project up in a demo state. So go check it out now while you still can.



Emotionally intelligent computers may already have a higher EQ than you

04:00 | 3 December

Andrew Thomson Crunch Network Contributor

Andrew Thomson is founder and CEO of VentureRadar, a London-based technology scouting startup that uses big data to connect companies with clients.

More posts by this contributor:
  • 7 Unexpected Virtual Reality Use Cases
  • Using The Blockchain To Fight Crime And Save Lives
How to join the network

From I, Robot to Ex Machina to Morgan, the idea of creating robots that can understand, compute and respond to human emotions has been explored in movies for decades. However, a common misconception is that the challenge of creating emotionally intelligent computing systems is too great to be met any time soon. In reality, computers are already demonstrating they can augment — or even replace — human emotional intelligence (EQ).

Perhaps surprisingly, it is the lack of emotion in computing systems that places them in such a good position to be emotionally intelligent — unlike humans, who aren’t always particularly good at reading others, and are prone to missing emotional signals or being fooled by lies.

According to Tomas Chamorro-Premuzic, “robots do not need to be able to feel in order to act in an emotionally intelligent manner. In fact, contrary to what people think, even in humans high EQ is associated with lower rather than higher emotionality. [High EQ] is about controlling one’s impulses and inhibiting strong emotions in order to act rationally and minimize emotional interference.”

In the field of affective computing, sensors and other devices are also getting very good at observing and interpreting facial features, body posture, gestures, speech and physical states, another key ingredient in emotional intelligence. Innovative companies across a range of industries are now using computing systems that can augment, and even improve on, human emotional intelligence.


In the high pressure environment of Wall Street, stock traders hold power over millions of dollars of their employer’s money, and split-second decisions can make or break careers.

The emotional state of employees can determine if they are at an increased risk of making a costly mistake, or if they have just made one. Historically, the management culture in some industries hasn’t always prioritized the emotional well-being of employees.

However, leading banks like JPMorgan Chase and Bank of America are now working with tech companies to put systems in place to monitor worker emotions and boost performance and compliance.

According to Bloomberg, a number of banks have partnered with Humanyze, a startup founded by MIT graduates that produces sensor-laden badges that transmit real-time data on speech, activity and stress patterns. While it may sound like a scene from Orwell’s 1984, the badges also contain microphones and proximity sensors that can help employers improve team productivity by analyzing behavioral data. The devices would allow managers to assist employees who are “out of their depth” and take decisive action, and also to highlight positive behavior, which can be used in team training.

Considerate driving

If you’ve ever sat with white knuckles in the back of a taxi as your driver swerves through traffic, then you’re probably quite excited about the prospect of “self-driving” cars that are programmed to follow the rules and be safe. As autonomous vehicles begin to replace manned vehicles, our robot drivers may be a whole lot more responsive to how you feel about their driving.

BRAIQ is a startup that is teaching autonomous vehicles to read their passengers’ comfort levels and learn to drive the way those passengers prefer. This personalization is intended both to increase passenger comfort and to foster trust in self-driving technology.

Off-the-shelf in-cabin sensors provide data on how the passengers feel about the car’s actions — such as acceleration, braking and steering. The collected biometric data is aggregated and analyzed, resulting in an AI whose driving style is responsive to a passenger’s comfort. BRAIQ’s software is effectively adding a layer of emotional intelligence on top of artificial intelligence.
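The feedback loop described above can be sketched in miniature. All names and numbers here are hypothetical, not BRAIQ’s actual system: the idea is simply to smooth a passenger discomfort signal and use it to throttle how assertively the car drives.

```python
# Illustrative sketch (hypothetical, not a real vendor API): keep an
# exponential moving average of a passenger "discomfort" signal and dial
# back the car's maximum acceleration as sustained discomfort rises.

class ComfortAdapter:
    def __init__(self, base_accel=3.0, alpha=0.2):
        self.base_accel = base_accel   # m/s^2, most assertive allowed style
        self.alpha = alpha             # EMA smoothing factor
        self.discomfort = 0.0          # smoothed signal in [0, 1]

    def update(self, sensor_reading: float) -> float:
        """Blend a new discomfort reading (0 = relaxed, 1 = alarmed) into
        the running average and return the adjusted acceleration limit."""
        r = min(max(sensor_reading, 0.0), 1.0)
        self.discomfort = (1 - self.alpha) * self.discomfort + self.alpha * r
        # Halve the acceleration budget at maximum sustained discomfort.
        return self.base_accel * (1.0 - 0.5 * self.discomfort)

adapter = ComfortAdapter()
limit = adapter.update(0.8)   # passenger tenses up during hard braking
```

The moving average matters: a single startled reading barely changes the driving style, while repeated discomfort steadily softens it, which is the personalization effect the article describes.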


New tech is also being created that will teach self-driving cars to communicate their intentions. To replace a wave of the hand to let someone pass in front of your car, or a flash of lights on the highway to let others know you are going to pass, Drive.ai has created a deep learning AI for driverless cars that allows vehicles to signal their intentions to humans through lights, sounds and movement.

The new tech uses deep learning to assess what is going on around the car via sensors and react appropriately to the situation. To effectively interact with pedestrians and other drivers, the cars could learn to use movements and sounds to indicate their next actions; for example, flashing lights to allow someone to pass, or rocking back and forth to indicate that the car is about to move forward.

Customer service

Cogito analyzes speaking patterns and conversational dynamics between call center agents and customers, providing real-time guidance to help phone professionals better engage and connect with customers.

Agents are guided to speak with more empathy, confidence, professionalism and efficiency, depending on the emotion detected through callers’ speech, while early signs of customer frustration and intent to purchase help improve service and close deals. Real-time dashboards enable supervisors to monitor and proactively intervene in live calls. Supervisors are automatically alerted to calls in which a customer is having a poor experience.

Cogito’s analytics provide objective insight into agents’ speaking behavior and customer experience on every call, and live customer experience scores help identify actionable best practices and trends for future training exercises.
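The alerting behavior described above amounts to a simple rule: notify a supervisor when frustration is sustained, not when it merely spikes. The sketch below is a hypothetical illustration of that idea, not Cogito’s implementation; the scores would come from a speech-analytics model, and here they are just numbers in [0, 1].

```python
# Hypothetical sketch: flag a live call for a supervisor when the caller's
# detected frustration stays above a threshold for several consecutive
# analysis windows, ignoring isolated spikes.

def flag_for_supervisor(frustration_scores, threshold=0.7, windows=3):
    """Return True if `windows` consecutive scores exceed `threshold`."""
    streak = 0
    for score in frustration_scores:
        streak = streak + 1 if score > threshold else 0
        if streak >= windows:
            return True
    return False

# A brief spike alone does not page a supervisor; a sustained one does.
brief = flag_for_supervisor([0.2, 0.9, 0.3, 0.8])          # stays False
sustained = flag_for_supervisor([0.4, 0.8, 0.85, 0.9, 0.6])  # becomes True
```

Requiring a streak rather than a single high reading is what keeps supervisors from being alerted on every momentary sigh or raised voice.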

Law enforcement

Law enforcement and government agencies around the world still use polygraph “lie detectors.” However, many experts question the continued use of the technology, arguing that the polygraph machines are inaccurate and can be tricked.


Nuralogix has created technology that reads emotional reactions invisible to the naked eye. Using a mix of Transdermal Optical Imaging and advanced machine learning algorithms, the technology assesses facial blood flow to reveal hidden human emotions. In a law enforcement setting, officials would be able to ask direct questions and then assess the respondent’s true emotions based on something they cannot physically control: the blood flow in their faces.

In a similar vein, researchers at MIT recently announced EQ-Radio, a device that, its creators claim, can assess a person’s feelings in the moment with 87 percent accuracy. The device reflects wireless signals off a person’s body, then uses algorithms to detect individual heartbeats, breathing patterns and levels of brain arousal, according to a report. To date, the technology has only been used to assess whether a participant is happy, sad or angry, but, as it develops, it could be trained for use in a similar way to a polygraph test.

Although it might be creepy to think that in the future we will be monitored by machines that can detect our emotions, computing systems with emotional intelligence are already surpassing human capabilities. Far from being stuck in the realm of science fiction, they could soon be a reality in our homes, cars and offices.

Featured Image: Bryce Durbin/TechCrunch

