The future of AI relies on a code of ethics

01:00 | 22 June

Facebook has recently come under intense scrutiny for sharing the data of millions of users without their knowledge. We’ve also learned that Facebook is using AI to predict users’ future behavior and selling that data to advertisers. Not surprisingly, Facebook’s business model and how it handles its users’ data have sparked a long-awaited conversation — and controversy — about data privacy. These revelations will undoubtedly force the company to evolve its data-sharing and protection strategy and policy.

More importantly, it’s a call to action: We need a code of ethics.

As the AI revolution continues to accelerate, new technology is being developed to solve key problems faced by consumers, businesses and the world at large. It is the next stage of evolution for countless industries, from security and enterprise to retail and healthcare. I believe that in the near future, almost all new technology will incorporate some form of AI or machine learning, enabling humans to interact with data and devices in ways we can’t yet imagine.

Moving forward, our reliance on AI will deepen, inevitably causing many ethical issues to arise as humans turn their cars, homes and businesses over to algorithms. These issues and their consequences will not discriminate, and the impact will be far-reaching, affecting everyone from ordinary citizens to small businesses utilizing AI to entrepreneurs developing the latest tech. No one will be left untouched. I am aware of a few existing initiatives focused on more research, best practices and collaboration; however, it’s clear that there’s much more work to be done.

Researchers, entrepreneurs and global organizations must lay the groundwork for a code of AI ethics to guide us through these upcoming breakthroughs and inevitable dilemmas. I should clarify that this won’t be a single code of ethics — each company and industry will have to come up with their own unique guidelines.

For the future of AI to become as responsible as possible, we’ll need to answer some tough ethical questions. I do not have the answers to these questions right now, but my goal is to bring more awareness to this topic, along with simple common sense, and work toward a solution. Here are some of the issues related to AI and automation that keep me up at night.

The ethics of driverless cars

With the invention of the car came the invention of the car accident. Similarly, an AI-augmented car will bring with it ethical and business implications that we must be prepared to face. Researchers and programmers will have to ask themselves what safety and mobility trade-offs are inherent in autonomous vehicles.

Ethical challenges will unfold as algorithms are developed that shape how humans and autonomous vehicles interact. Should these algorithms be transparent? For example, will a car rear-end an abruptly stopped car, or swerve and hit a dog on the side of the street? Key decisions will be made in split seconds by a fusion processor running AI and integrating input from the car’s vast array of sensors. Will entrepreneurs and small businesses be kept in the dark while these algorithms dominate the market?

Driverless cars will also transform the way consumers behave. Companies will need to anticipate this behavior and offer solutions to fill those gaps. Now is the time to start predicting how this technology will change consumer needs and what products and services can be created to meet them.

The battle against fake news

As our news media and social platforms become increasingly AI-driven, businesses from startups to global powerhouses must be aware of their ethical implications and choose wisely when working this technology into their products.

We’re already seeing AI being used to create and defend against political propaganda and fake news. Meanwhile, dark money has been used for social media ads that can target incredibly specific populations in an attempt to influence public opinion or even political elections. What happens when we can no longer trust our news sources and social media feeds?

AI will continue to give algorithms significant influence over what we see and read in our daily lives. We have to ask ourselves how much trust we can put in the systems that we’re creating and how much power we can give them. I think it’s up to companies like Facebook, Google and Twitter — and future platforms — to put safeguards in place to prevent these systems from being misused. We need the equivalent of Underwriters Laboratories (UL) for news!

The future of the automated workplace

Companies large and small must begin preparing for the future of work in the age of automation. Automation will replace some labor and enhance other jobs. Many workers will be empowered with these new tools, enabling them to work more quickly and efficiently. However, many companies will have to account for the jobs lost to automation.

Businesses should begin thinking about what labor may soon be automated and how their workforce can be utilized in other areas. A large portion of the workforce will have to be trained for new jobs created by automation, in what is becoming commonly referred to as collaborative automation. The challenge will come when deciding how to retrain and redistribute employees whose jobs have been automated or augmented. Who will be responsible: the government, employers or automation companies? In the end, these sectors will need to work together as automation changes the landscape of work.

It’s true that AI is the next stage of tech evolution, and that it’s everywhere. It has become portable, accessible and economical. We have now, finally, reached the AI tipping point. But that point is on a precarious edge, see-sawing somewhere between an AI dreamland and an AI nightmare.

In order to surpass the AI hype and take advantage of its transformative powers, it’s essential that we get AI right, starting with the ethics. As entrepreneurs rush to develop the latest AI tech or use it to solve key business problems, each has a responsibility to consider the ethics of this technology. Researchers, governments and businesses must cooperatively develop ethical guidelines that help ensure the responsible use of AI for the benefit of all.

From driverless cars to media platforms to the workplace, AI is going to have a significant impact on how we live our lives. But as AI thought leaders and experts, we shouldn’t just deliver the technology — we need to closely monitor it and ask the right questions as the industry evolves.

There has never been a more exciting time to be an entrepreneur in the rise of AI, but there’s a lot of work to be done now and in the future to ensure we’re using the technology responsibly.


Inside Atari’s rise and fall

17:30 | 21 June

Jamie Lendino Contributor
Jamie Lendino is the editor-in-chief of ExtremeTech

By the first few months of 1982, it had become more common to see electronics stores, toy stores, and discount variety stores selling 2600 games. This was before Electronics Boutique, Software Etc., and later, GameStop. Mostly you bought games at stores that sold other electronic products, like Sears or Consumer Distributors. Toys ’R’ Us was a big seller of 2600 games. To buy one, you had to get a piece of paper from the Atari aisle, bring it to the cashier, pay for it, and then wait at a pickup window behind the cash register lanes.

Everyone had a favorite store in their childhood; here’s a story about one of mine. A popular “destination” in south Brooklyn is Kings Plaza, a giant (for Brooklyn) two-story indoor mall with about 100 stores. My mother and grandmother were avid shoppers there. To get to the mall from our house, it was about a 10-minute car service ride. So once a week or thereabouts, we’d all go. The best part for me was when we went inside via its Avenue U entrance instead of on the Flatbush Avenue side. Don’t ask me what went into this decision each time; I assume it depended on the stores my mother wanted to go to. All I knew was the Avenue U side had this circular kiosk maybe 50 feet from the entrance. The name has faded from memory. I remember it was a kind of catch-all for things like magazines, camera film, and other random stuff.

But the most important things were the Atari cartridges. There used to be dozens of colorful Atari game boxes across the wall behind the counter. When we walked up to the cashier’s window, there was often a row of new Atari games across the top as well. Sometimes we left without a new cartridge, and sometimes I received one. But we always stopped and looked, and it was the highlight of my trip to the mall each time.

For whatever reason, I remember the guy behind the counter gave me a hard time one day. I bought one of Atari’s own cartridges—I no longer remember which, but I’m almost sure it was either Defender or Berzerk—that came with an issue of Atari Force, the DC comic book. I said I was excited to get it. The guy shot me a dirty look and said, “You’re buying a new Atari cartridge just for a comic book?” I was way too shy to argue with him, even though he was wrong and I wanted the cartridge. I don’t remember what my mother said, or if she even heard him. Being too shy to protest, I sheepishly took my game and we both walked away.

Mattel Stumbles, While Atari Face-Plants

Mattel began to run into trouble with its Intellivision once the company tried to branch out from sports games. Because Mattel couldn’t license properties from Atari, Nintendo, or Sega, it instead made its own translations of popular arcade games. Many looked better than what you’d find on the 2600, but ultimately played more slowly thanks to the Intellivision’s sluggish CPU. Perhaps the most successful was Astrosmash, a kind of hybrid of Asteroids and Space Invaders, in which asteroids, spaceships, and other objects fell from the sky in progressively more difficult waves. Somewhat less successful were games like Space Armada (a Space Invaders knock-off).

Mattel also added voice synthesis—something that was all the rage at the time—to the Intellivision courtesy of an add-on expansion module called Intellivoice. But only a few key games delivered voice capability: Space Spartans, Bomb Squad, B-17 Bomber (all three were launch titles), and later, Tron: Solar Sailer. The Intellivoice’s high cost, lack of a truly irresistible game, and overall poor sound quality meant this was one thing Atari didn’t have to find a way to answer with the 2600.

These events made it easier for Atari to further pull away from Mattel in the marketplace, and it did so—but not without a tremendous self-inflicted wound. A slew of new 2600 games arrived in the first part of 1982. Many important releases came in this period and those that followed, and we’ll get to those shortly. But there was one in particular that the entire story arc of the platform balanced on, and then fractured. It was more than a turning point; its repercussions reverberated throughout the then-new game industry, and to this day it sticks out as one of the key events that ultimately did in Atari.

Pac-Man (Atari, March 1982)

The single biggest image-shattering event for the 2600—and Atari itself—was the home release of its Pac-Man cartridge. I can still feel the crushing disappointment even now. So many of my friends and I looked forward to this release. We had talked about it all the time in elementary school. Pac-Man was simply the hottest thing around in the arcades, and we dreamed of playing it at home as much as we wanted. The two-year wait for Atari to release the 2600 cartridge seemed like forever. Retailers bought into the hype as well. Toy stores battled for inventory, JC Penney and Kmart bought in big along with Sears and advertised on TV, and even local drug stores started stocking the game. And yet, what we got…wasn’t right.

Just about everyone knows how Pac-Man is supposed to work, but just in case: You gobble up dots to gain points while avoiding four ghosts. Eat a power pellet, and you can turn the tables on the ghosts, chase them down, and eat them. Each time you do so, the “eyes” of the ghost fly back to the center of the screen and the ghost regenerates. Eat all the dots and power pellets on the screen, and you progress to the next one, which gets harder. Periodically, a piece of fruit appears at the center of the screen. You can eat it for bonus points, and the kind of fruit denotes the level you are on (cherry, strawberry, orange, and so on).

But that’s not the game Atari 2600 owners saw. After securing the rights to the game from Namco, Atari gave programmer Tod Frye just five weeks to complete the conversion. The company had learned from its earlier mistakes and promised Frye a royalty on every cartridge manufactured (not sold), which was an improvement. But this was another mistake. The royalty plus the rushed schedule meant Frye made money even if the game wasn’t up to snuff, so he had an incentive to complete it regardless. Atari also required the game to fit into just 4KB like older 2600 cartridges, rather than the newer 8KB size that was becoming much more common by this point. That profit-driven limitation heavily influenced the way Frye approached the design of the game. To top it all off, Atari set itself up for a colossal failure by producing some 12 million cartridges, even though there were only 10 million 2600 consoles in circulation at the time. The company was confident that not only would every single existing 2600 owner buy the game, but that 2 million new customers would buy the console itself just for this cartridge.

We all know how it turned out. The instruction manual sets the tone for the differences from the arcade early on. The game is now set in “Mazeland.” You eat video wafers instead of dots. Every time you complete a board, you get an extra life. The manual says you also earn points from eating power pills, ghosts, and “vitamins.” Something is definitely amiss.

Pac-Man himself always looks to the right or left, even if he is going up or down. The video wafers are long and rectangular instead of small, square dots. Fruits don’t appear periodically at the center of the screen. Instead, you get the aforementioned vitamin, a clear placeholder for what would have been actual fruit had there been more time to get it right. The vitamin always looks the same and is always worth 100 points, instead of increasing as you clear levels. The rest of the scoring is much lower than it is in the arcade. Gobbling up all four ghosts totals just 300 points, and each video wafer is worth just 1 point.

The ghosts have tremendous amounts of flicker, and they all look and behave identically, instead of having different colors, distinct personalities, and eyes that pointed in the right direction. The flicker was there for a reason. Frye used it to draw the four ghosts in successive frames with a single sprite graphic register, and drew Pac-Man every frame using the other sprite graphic register. The 2600’s TIA chip synchronizes with an NTSC television picture 60 times per second, so you end up seeing a solid Pac-Man, maze, and video wafers (I can still barely type “video wafers” with a straight face), but the ghosts are each lit only one quarter of the time. A picture tube’s phosphorescent glow takes a little bit to fade, and your eye takes a little while to let go of a retained image as well, but the net result is that the flicker is still quite visible.
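To make the scheme concrete, here is a minimal sketch of the time-multiplexing described above, written in Python rather than 6502 assembly, with a hypothetical draw_sprite helper and arcade ghost names standing in for whatever the cartridge actually calls them. One sprite register redraws Pac-Man every frame, while the other cycles through the four ghosts, so each ghost appears on only every fourth 60 Hz frame.

```python
# Illustrative model of the 2600 Pac-Man flicker scheme: the TIA has two
# sprite graphic registers, so Pac-Man gets one every frame and the four
# ghosts share the other, one ghost per frame.
GHOSTS = ["blinky", "pinky", "inky", "clyde"]  # names assumed for clarity

def render_frame(frame_number, draw_sprite):
    draw_sprite("pacman")                      # register 1: solid, 60 Hz
    draw_sprite(GHOSTS[frame_number % 4])      # register 2: lit 1/4 of frames

# Simulate eight frames; each ghost shows up in only two of them.
for frame in range(8):
    render_frame(frame, lambda name: print(f"frame {frame}: draw {name}"))
```

Over any four consecutive frames each ghost is drawn exactly once, which is why the maze and Pac-Man look solid while the ghosts visibly flicker.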

It gets worse. The janky, gritty sound effects are bizarre, and the theme song is reduced to four dissonant chords. (Oddly, these sounds resurfaced in some movies over the next 20 years and were a default “go-to” for sound designers working in post-production.) The horizontally stretched maze is nothing like the arcade, either, and the escape routes are at the top and bottom instead of the sides. The maze walls aren’t even blue; they’re orange, with a blue background, because it’s been reported Atari had a policy that only space games could have black backgrounds (!). At this point, don’t even ask about the lack of intermissions.

One of Frye’s own mistakes is that he made Pac-Man a two-player game. “Tod used a great deal of memory just tracking where each player had left off with eaten dots, power pellets, and score,” wrote Goldberg and Vendel in Atari Inc.: Business is Fun. Years later, when Frye looked at the code for the much more arcade-faithful 2600 Ms. Pac-Man, he saw the programmers were “able to use much more memory for graphics because it’s only a one player game.”

Interestingly, the game itself is still playable. Once you get past the initial huge letdown and just play it on its own merits, Pac-Man delivers a decent experience. It’s still “Pac-Man,” sort of, even if it’s a rough approximation of the real thing, as if it were seen and played through a straw. It’s worth playing today for nostalgia—after all, many of us played this cartridge to death anyway, because it was the one we had—and certainly as a historical curiosity for those who weren’t around for the golden age of arcades.

Many an Atari 2600 fan turned on the platform—and Atari in general—after the release of Pac-Man. Although the company still had plenty of excellent games and some of the best were yet to come, the betrayal was immediate and real and forever colored what much of the gaming public thought of Atari. The release of the Pac-Man cartridge didn’t curtail the 2600’s influence on the game industry by any means; we’ll visit many more innovations and developments as we go from here on out. But the 2600 conversion of Pac-Man gave the fledgling game industry its first template for how to botch a major title. It was the biggest release the Atari 2600 had and would ever see, and the company flubbed it about as hard as it could. It was New Coke before there was New Coke.

Grand Prix (Activision, March 1982)

The next few games we’ll discuss further illustrate the quality improvements upstart third-party developers delivered, in comparison with Atari, which had clearly become too comfortable in its lead position. First up is Activision’s Grand Prix, which in hindsight was a bit of an odd way to design a racer. It’s a side-scroller on rails that runs from left to right, and is what racing enthusiasts call a time trial. Although other computer-controlled cars are on the track, you’re racing against the clock, not them, and you don’t earn any points or improve your position on the track by passing them.

Gameplay oddities aside, the oversized Formula One cars are wonderfully detailed, with brilliant use of color and animated spinning tires. The shaded color objects were the centerpiece of the design, as programmer David Crane said in a 1984 interview. “When I developed the capability for doing a large multicolored object on the [2600’s] screen, the capability fitted the pattern of the top view of a Grand Prix race car, so I made a racing game out of it.” Getting the opposing cars to appear and disappear properly as they entered and exited the screen also presented a problem, as the 2600’s lack of a frame buffer came into play again. The way TIA works, the 2600 would normally just make the car sprite begin to reappear on the opposite side of the screen as it disappeared from one side. To solve this issue, Crane ended up storing small “slices” of the car in ROM, and in real time the game drew whatever portions of the car were required to reach the edge of the screen. The effect is smooth and impossible to detect while playing.
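Crane’s edge-of-screen trick is easier to picture in code. Here is a minimal sketch, in Python with invented names (make_slices, sprite_for), of the idea the paragraph above describes: pre-cut partial copies of the car sprite live in ROM, and the game picks whichever slice matches how much of the car should still be visible, so nothing ever wraps to the far side of the screen.

```python
CAR_WIDTH = 8          # assumed sprite width, one byte per scanline
SCREEN_WIDTH = 160     # 2600 playfield pixels per scanline

def make_slices(car_rows):
    """Pre-cut copies of the car as they might be stored in ROM:
    slices[k] keeps only the k leftmost columns, blanking the rest."""
    slices = []
    for k in range(CAR_WIDTH + 1):
        mask = (0xFF << (CAR_WIDTH - k)) & 0xFF
        slices.append([row & mask for row in car_rows])
    return slices

def sprite_for(slices, x):
    """Choose the slice that exactly fills the space left at the right
    edge, instead of letting the hardware wrap the sprite around."""
    visible = min(CAR_WIDTH, max(0, SCREEN_WIDTH - x))
    return slices[visible]
```

A mirrored set of slices would handle cars entering from the left edge the same way.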

The car accelerates over a fairly long period of time, and steps through simulated gears. Eventually it reaches a maximum speed and engine note, and you just travel along at that until you brake, crash into another car, or reach the finish line. As the manual points out, you don’t have to worry about cars coming back and passing you again, even if you crash. Once you pass them, they’re gone from the race.

The four game variations in Grand Prix are named after famous courses that resonate with racing fans (Watkins Glen, Brands Hatch, Le Mans, and Monaco). The courses bear no resemblance to the real ones; each game variation is simply longer and harder than the last. The tree-lined courses are just patterns of vehicles that appear on screen. Whenever you play a particular game variation, you see the same cars at the same times (unless you crash, which disrupts the pattern momentarily). The higher three variations include bridges, which you have to quickly steer onto or risk crashing. During gameplay, a series of oil slicks serves as your warning that a bridge is coming up soon.

Although Atari’s Indy 500 set the bar early for home racing games on the 2600, Grand Prix demonstrated you could do one with a scrolling course and much better graphics. This game set the stage for more ambitious offerings the following year. And several decades later, people play games like this on their phones. We just call titles like Super Mario Run (a side-scroller) and Temple Run (3D-perspective) “endless runners,” as they have running characters instead of cars.

Activision soon became the template for other competing third-party 2600 developers. In 1981, Atari’s marketing vice president and a group of developers, including the programmers for Asteroids and Space Invaders on the console, started a company called Imagic. The company had a total of nine employees at the outset. Its name was derived from the words “imagination” and “magic”—two key components of every cartridge the company planned to release. Imagic games were known for their high quality, distinctive chrome boxes and labels, and trapezoidal cartridge edges. As with Activision, most Imagic games were solid efforts with an incredible amount of polish and were well worth purchasing.

Although Imagic technically became the second third-party developer for the 2600, the company’s first game didn’t arrive until March 1982. Another company, Games by Apollo, beat it to the punch by starting up in October 1981 and delivering its first (mediocre) game, Skeet Shoot, before the end of the year.

But when that first Imagic game did arrive, everyone noticed.

Demon Attack

At first glance, the visually striking Demon Attack looks kind of like a copy of the arcade game Phoenix, at least without the mothership screen (something it does gain in the Intellivision port). But the game comes into its own the more you play it. You’re stuck on the planet Krybor. Birdlike demons dart around and shoot clusters of lasers down toward you at the bottom of the screen. Your goal is to shoot the demons all out of the sky, wave after wave.

The playfield is mostly black, with a graded blue surface of the planet along the bottom of the screen. A pulsing, beating sound plays in the background. It increases in pitch the further you get into each level, only to pause and then start over with the next wave. The demons themselves are drawn beautifully, with finely detailed, colorful designs that are well animated and change from wave to wave. Every time you complete a wave, you get an extra life, to a maximum of six.

On later waves, the demons divide in two when shot, and are worth double the points. You can shoot the smaller demons, or just wait—eventually each one swoops down toward your laser cannon, back and forth until it reaches the bottom of the screen, at which point it disappears from the playfield. Shoot it while it’s diving and you get quadruple points. In the later stages, demons also shoot longer, faster clusters of lasers at your cannon.

The game is for one or two players, though there’s a cooperative mode that lets you take turns against the same waves of demons. There are also variations of the game that let you shoot faster lasers, as well as tracer shots that you can steer into the demons. After 84 waves, the game ends with a blank screen, though reportedly a later run of this cartridge eliminates that and lets you play indefinitely. If I were still nine years old, I could probably take a couple of days out of summer and see if this is true. I am no longer nine years old.

Demon Attack was one of Imagic’s first three games, along with Trick Shot and Star Voyager. Rob Fulop, originally of Atari fame and one of Imagic’s four founders, programmed Demon Attack. In November 1982, Atari sued Imagic because of Demon Attack’s similarity to Phoenix, the home rights of which Atari had purchased from Centuri. The case was eventually settled. Billboard magazine listed Demon Attack as one of the 10 best-selling games of 1982. It was also Imagic’s best-selling title, and Electronic Games magazine awarded it Game of the Year.

“The trick to the Demon Attack graphics was it was the first game to use my Scotch-taped/rubber-banded dedicated 2600 sprite animation authoring tool that ran on the Atari 800,” Fulop said in 1993. “The first time Michael Becker made a little test animation and we ran Bob Smith’s utility that successfully squirted his saved sprite data straight into the Demon Attack assembly code and it looked the same on the [2600] as it did on the 800 was HUGE! Before that day, all 2600 graphics ever seen were made using a #2 pencil, a sheet of graph paper, a lot of erasing, and a list of hex codes that were then retyped into the source assembly code, typically introducing a minimum of two pixel errors per eight-by-eight graphic stamp.”

Although you can draw a line from Space Invaders to just about any game like this, Demon Attack combines that with elements of Galaga and Phoenix, with a beautiful look and superb gameplay all its own.

Pitfall! (Activision, April 1982)

A watershed moment in video game history, David Crane’s Pitfall! was one of the best games released for the 2600. As Pitfall Harry, your goal is to race through the jungle and collect 32 treasures—money bags, silver bars, gold bars, and diamond rings, worth from 2,000 to 5,000 points each. Jump and grab vines, and you soar over lakes, quicksand, and alligators, complete with a Tarzan-style “yell.” You can stumble on a rolling log or fall into a hole, both of which just dock you some points. Each time you fall into quicksand or a tar pit, drown in a lake, burn in a fire, or get eaten by an alligator or scorpion, you lose a life. When that happens, you start the next one by dropping from the trees on the left side of the screen to keep playing.

Pushing the joystick left or right makes Pitfall Harry run. He picks up treasure automatically. Holding the stick in either direction while pressing the button makes him jump, either over an obstacle or onto a swinging vine (running into the vine without jumping also works). Push down while swinging to let go of the vine. You also can push up or down to climb ladders.

In an incredible feat of programming, the game contains 255 screens, with the 32 treasures scattered throughout them. The world loops around once you reach the last screen. Although Adventure pioneered the multiroom map on the 2600, Pitfall! was a considerably larger design. Crane fit the game into the same 4KB ROM as Adventure. But rather than storing all 255 screens as part of the ROM—which wouldn’t have fit—Crane’s solution was not to store the world in ROM at all. Instead, the world is generated by code, the same way each time. This is similar to games like Rogue, but even in that case, the game generates the world and then stores it during play. Pitfall! generates each screen via an algorithm, using a counter that increments in a pseudorandom sequence that is nonetheless consistent and can be run forwards or backwards. The 8 bits of each number in the counter sequence define the way the board looks. Bits 0 through 2 are object patterns, bits 3 through 5 are ground patterns, bits 6 and 7 cover the trees, and bit 7 also affects the underground pattern. This way, the world is generated the same way each and every single time. When you leave one screen, you always end up on the same next screen.
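A minimal reconstruction of that kind of counter is sketched below in Python. The feedback taps here are illustrative, not Crane’s actual polynomial; the point is the property the paragraph describes: the sequence is deterministic, each 8-bit value fully defines a screen, and the counter can be stepped forward or backward, so walking left and right always revisits the same rooms.

```python
def step_right(r):
    """Advance the 8-bit counter one screen to the right (LFSR-style;
    the taps are assumed for illustration)."""
    bit = ((r >> 3) ^ (r >> 4) ^ (r >> 5) ^ (r >> 7)) & 1
    return ((r << 1) | bit) & 0xFF

def step_left(r):
    """Exactly invert step_right: one screen back to the left."""
    bit = r & 1
    old_high = (bit ^ (r >> 4) ^ (r >> 5) ^ (r >> 6)) & 1
    return (r >> 1) | (old_high << 7)

def decode_screen(r):
    """Carve the screen description out of the bit fields given above."""
    return {
        "objects": r & 0b111,          # bits 0-2: object pattern
        "ground": (r >> 3) & 0b111,    # bits 3-5: ground pattern
        "trees": (r >> 6) & 0b11,      # bits 6-7: tree pattern
        "underground": (r >> 7) & 1,   # bit 7 also picks the tunnel layout
    }

r = 0xC4                                # any starting screen value
assert step_left(step_right(r)) == r    # reversible by construction
```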

“The game was a jewel, a perfect world incised in a mere [4KB] of code,” Nick Montfort wrote in 2001 in Supercade: A Visual History of the Videogame Age, 1971-1984.

You get a total of three lives, and Crane points out in the manual that you need to use some of the underground passages (which skip three screens ahead instead of one) to complete the game on time. The inclusion of two on-screen levels—above ground and below ground, with ladders connecting them—makes the game an official platformer. And the game even gives you some say in where to go and what path you take to get there. Pitfall Harry is smoothly animated, and the vines deliver a genuine sensation of swinging even though the game is in 2D.

The game’s 20-minute timer, which approximates the 22-minute length of a standard half-hour television show, marked a milestone for console play. It was much longer than most arcade games and even cartridges like Adventure, which you could complete in a few minutes. The extra length allows for more in-depth play.

“Games in the early ’80s primarily used inanimate objects as main characters,” Crane said in a 2011 interview. “Rarely there would be a person, but even those weren’t fully articulated. I wanted to make a game character that could run, jump, climb, and otherwise interact with an on-screen world.” Crane spent the next couple of years tinkering with the idea before finally coming up with Pitfall!. “[After] only about 10 minutes I had a sketch of a man running on a path through the jungle collecting treasures. Then, after ‘only’ 1,000 hours of pixel drawing and programming, Pitfall Harry came to life.”

Crane said he had already gone beyond that 4KB ROM limit and back within it many times over hundreds of hours. Right before release, he was asked to add additional lives. “Now I had to add a display to show your number of lives remaining, and I had to bring in a new character when a new life was used.” The latter was easy, Crane said, because Pitfall Harry already knew how to fall and stop when he hit the ground. Crane just dropped him from behind the tree cover. “For the ‘Lives’ indicator I added vertical tally marks to the timer display. That probably only cost 24 bytes, and with another 20 hours of ‘scrunching’ the code I could fit that in.”

Pitfall! couldn’t have been timed more perfectly, as Raiders of the Lost Ark was the prior year’s biggest movie. The cartridge delivered the goods; it became the best-selling home video game of 1982 and it’s often credited as the game that kickstarted the platformer genre. Pitfall! held the top spot on Billboard’s chart for 64 consecutive weeks. “The fine graphic sense of the Activision design team greatly enriches the Pitfall! experience,” Electronic Games magazine wrote in January 1983, on bestowing the cartridge Best Adventure Videogame. “This is as richly complex a video game as you’ll find anywhere…Watching Harry swing across a quicksand pit on a slender vine while crocodiles snap their jaws frantically in a futile effort to tear off a little leg-of-hero snack is what video game adventures are all about.” Pitfall!’s influence is impossible to overstate. From Super Mario Bros. to Prince of Persia to Tomb Raider, it was the start of something huge.


The best home Wi-Fi and networking gear

17:30 | 20 June

Makula Dunbar Contributor
Makula Dunbar is a writer with Wirecutter.

Editor’s note: This post was done in partnership with Wirecutter. When readers choose to buy Wirecutter’s independently chosen editorial picks, Wirecutter and TechCrunch earn affiliate commissions.

It’s safe to say that for many, a world without internet is hard to imagine. And when you need a solid internet connection for work, studying or catching up on your favorite shows, a poor connection can feel almost as bad.

While fast and reliable internet starts with having a good provider, owning the right gear helps to support it. From network storage to the best router, we’ve compiled picks for helping you set up and secure a dependable home Wi-Fi network.

Wi-Fi router: Netgear R7000P Nighthawk

More than anything, when it comes to setting up a home Wi-Fi network, it’s important to have a good Wi-Fi router. If you don’t rent one from your internet service provider, you’ll want one that’s easy to use, has a decent connection range, and is able to handle a crowded network.

Our top pick, the Netgear R7000P Nighthawk, is a dual-band, three-stream 802.11ac router that offers solid speed and throughput performance across long and moderate ranges. Its load-balancing band steering automatically kicks in when networks are busy, which means you won’t sit around clicking refresh and resending requests. We like that its toggles and features are easy to find. This router is ideal for larger spaces, including homes that experience coverage issues.

Cable modem: Netgear CM500

If you have cable internet, a cable modem is one of the pieces of equipment that’s supplied by your service provider. As with a router, you may be paying an additional fee outside of the cost of internet service to have one. While a router helps your wireless devices communicate and use an internet connection, a modem is the device that connects your home network to the wider internet.

If you plan on opting out of renting a modem, or recently have, you’ll like that our top recommendation, the Netgear CM500, pays for itself in about six months. It’s compatible with most cable internet service providers, it’ll last for years, and you can rely on it to support internet plan speeds of up to 300 Mbps.

Wi-Fi mesh-networking kit: Netgear Orbi RBK50

In some homes, it isn’t uncommon to try moving a router around to find the best signal. However, with spaces larger than 2,000 square feet — or small to large spaces with brick, concrete, or lath-and-plaster interior walls — relocating a router may not do the trick. Instead of relying on a single router, a Wi-Fi mesh-networking kit uses multiple access points, improving overall Wi-Fi performance and range.

Our top pick, the Netgear Orbi RBK50, comes with a base router and satellite, each unit a tri-band device. We think these two units are enough for supporting a solid home Wi-Fi network in most spaces, but you can add another unit to this kit if necessary. It’s equipped with more than enough Ethernet ports, and it’ll work without an internet connection during setup or an internet outage.

Wi-Fi extender: TP-Link RE200

For an even simpler, budget-friendly solution to bolstering a home Wi-Fi signal, we recommend the dual-band TP-Link RE200, our top pick for Wi-Fi extenders. It doesn’t extend the range of a network per se, but it increases throughput and decreases latency for a better Wi-Fi experience. It should be paired with a good router and is best for improving the speed of one device at a time over a network that isn’t too busy.

It’s compact, plugs into a power outlet and has an Ethernet port for easily connecting nearby devices. It’s a helpful middle-ground option for strengthening a Wi-Fi connection when you don’t need a mesh-networking kit and your existing router doesn’t need to be replaced.

VPN service: IVPN

A virtual private network, or VPN, helps to ensure that a connection is secure. It’s an extra layer of protection that encrypts your online activity and should be used in addition to password managers, browser plug-ins that support privacy and encrypted hardware. While network security is necessary, a service that’s transparent and trustworthy with helpful support is more important.

Out of the 12 VPN services that we tested, we think IVPN is the best provider. IVPN doesn’t log or monitor activity, and we like that it’s stable, fast and works across platforms. In situations where your network isn’t protected, or when you join an unsecured network, its “firewall” and OpenVPN protocol features will keep you covered.

NAS for most home users: Synology DS218+

For homes and spaces where multiple computers are used, network-attached storage (NAS) devices back up data and files from the devices on the local network to one central place, or to a cloud service. A NAS is a small computer equipped with one or two hard drive bays, and it stays on around the clock while using less power than a repurposed computer. Our top recommendation, the Synology DS218+, is easy to manage, and it has three USB ports and the fastest write speeds of any NAS we tested.

For those with a massive database of files, a NAS device is a better option than an external drive. With the DS218+, you can play back media and conveniently access your data as it supports FTP protocol, VPN server capabilities, SSDs and more. The biggest plus with owning a NAS device is having the option to use it as a storage device, website-hosting device, a media streamer, or anything else that a Linux computer can function as.

This guide may have been updated by Wirecutter.

Note from Wirecutter: When readers choose to buy our independently chosen editorial picks, we may earn affiliate commissions that support our work.


Why startups can’t afford to ignore customer retention

00:00 | 20 June

Venture-backed companies must walk the line between fast growth and efficient growth. Even as VCs value high-quality revenue, companies are still held to a minimum growth rate. We think of this threshold as the “Mendoza Line,” a baseball term we’ve adapted to track the minimum growth needed to get access to venture funding. Above this line, startups are generally attractive to investors and even have a good chance for a strong exit.

To achieve sustainable growth, maximizing customer lifetime value is an important component and one that is often underestimated, particularly for SaaS and other subscription-based businesses that generate recurring revenue. It is estimated to cost somewhere between five and 25 times more to acquire a new customer than to keep one you already have. Additionally, Bain research has shown that a five percent increase in retention rates can increase profits by 25 to 95 percent. Even by conservative estimates, retention is a powerful mechanism for growth.

As companies face greater pressure to grow both quickly and responsibly, we are placing more value on customer retention as a barometer for long-term success. And we are seeing smart startups invest in measuring customer happiness in more sophisticated and consistent ways.

In looking at SaaS deals over the past 10 years, we’ve found that a few key metrics and best practices are predictive of healthy business fundamentals. Here’s the advice I give startups looking to achieve smart growth through customer retention.

Create a system for measuring customer happiness

First, measurement must be an executive priority. Ensure you have a system in place to measure retention on a quarterly basis (at least) and meet as an executive team to diagnose potential problems. While benchmarking against similar businesses can be helpful, trending your own metrics is the best way to see how your performance is improving or deteriorating.

You’ll need to identify the specific metrics that work best for your business. I recommend looking at how efficiently you’re putting resources toward customer retention, which gives you insight into customer happiness and predicts the profitability of your growth.

The percent of ARR spent on retention tells you how much you’re spending to keep your customers happy; let’s call it your Retention Efficiency. You can measure this with a simple calculation:

(Quarterly cost of customer retention × 4) ÷ Ending annual recurring revenue (ARR) base

The ability to keep this number low means you’re retaining your customers without burning money. This means you can invest sales resources toward acquiring net new customers rather than replacing revenue from those that have left.

I’d also recommend looking at the Customer Retention Cost (CRC), which measures how much on average you’re spending to retain each customer:

(Quarterly cost of customer retention × 4) ÷ Total number of customers in your base
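In code, both cost-side metrics are one-liners. Here is a minimal sketch in Python; the variable names and dollar figures are made-up examples, not numbers from this article:

```python
# Hypothetical quarterly figures for a SaaS business, in dollars.
quarterly_retention_cost = 250_000    # spend on keeping current customers
ending_arr = 20_000_000               # ending annual recurring revenue base
total_customers = 400

# Percent of ARR spent on retention ("Retention Efficiency").
retention_efficiency = (quarterly_retention_cost * 4) / ending_arr

# Average annual spend to retain each customer (CRC).
customer_retention_cost = (quarterly_retention_cost * 4) / total_customers

print(f"Retention Efficiency: {retention_efficiency:.1%}")        # 5.0%
print(f"CRC: ${customer_retention_cost:,.0f} per customer/year")  # $2,500
```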

Note, this number may increase over time if you’re moving upmarket — enterprise customers generally require more resources to retain than small to mid-sized companies. If your retention costs are going up, this per-customer number can help you explain why in the context of your go-to-market strategy.

Don’t just measure churn rate

Most startups measure retention in terms of churn rate: dollars that left in a given quarter divided by total ARR. In my experience, churn is a vanity metric and not particularly accurate because it combines customers that are eligible to leave and those that are not (e.g. contracts that were signed in the past month).

Renewal rate is harder to benchmark, but it tells you more about your customer happiness and the health of the business overall. Gross Renewal Rate shows you the dollars that renewed as a percentage of all dollars that were eligible to be renewed. Calculate it by summing all renewed contracts and dividing that total by the dollars that were up for renewal:

Dollars renewed ÷ Dollars eligible to renew

Net Renewal Rate is a measurement of the growth of your existing customer base, net of any churn, as a percentage of all dollars that were eligible to renew. Include any expansion dollars with your renewed dollars in your calculation to get Net Renewal Rate:

(Dollars renewed + Dollars expanded) ÷ Dollars eligible to renew
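Continuing the sketch above with made-up renewal figures, the two rates differ only in whether expansion dollars are counted:

```python
# Hypothetical renewal-quarter figures, annualized dollars.
dollars_renewed = 4_200_000    # contracts that actually renewed
dollars_expanded = 600_000     # upsell/expansion within the base
dollars_eligible = 5_000_000   # contracts that were up for renewal

gross_renewal_rate = dollars_renewed / dollars_eligible
net_renewal_rate = (dollars_renewed + dollars_expanded) / dollars_eligible

print(f"Gross Renewal Rate: {gross_renewal_rate:.1%}")  # 84.0%
print(f"Net Renewal Rate: {net_renewal_rate:.1%}")      # 96.0%
```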

Calculating renewal rate by segment is even more helpful in diagnosing issues of customer dissatisfaction. For instance, if your renewal rates are trending down in the SMB segment but not at the enterprise level, you might identify a problem with product-segment fit. Perhaps the product is too complex for SMB customers, while enterprise customers need those features.

Don’t look to customer success as the fix-all solution

If you’re looking to improve retention, the answer isn’t necessarily to pour resources into your customer success organization. Retention is one area that can be impacted by several functions. Look into the factors that play into customer lifetime value, including:

  • Product: Increases in churn or retention costs could signal that you’re drifting from product-market fit or that your product faces increased competitive pressure.
  • Marketing and sales: Ask yourself the following: Does your marketing accurately message your value proposition? How much is your sales team promising above and beyond what the product can do?
  • Customer success: Make sure you’re engaging with customers beyond the first three months of their deployment; the next six to nine months are critical for success. Measure customer success throughout the life cycle to ensure users are getting the most out of the product and understand how to use it.

Define a product engagement metric

Understanding how much your customers actually use and depend on your product is the best indicator of happiness. Engaged customers are more likely to renew their contract — which helps to keep your retention numbers steady. They’re also more likely to tell others about their experience with your product, which improves top-line growth.

Experiment with an engagement metric that works for your business: for DocuSign, it’s the number of envelopes sent; for JFrog, it’s the volume of binaries distributed; for Textio, it’s the number of job requisitions written in the platform.

Your ability to keep customers happy without spending a ton of resources speaks to the value you’re delivering. And if you retain customers efficiently, you can spend more on acquiring new customers. In evaluating a portfolio company, I’d much rather see a business with good growth and high-quality customer retention than one with explosive growth but low retention. VCs will hold you to these metrics — make sure you’re accountable for them.


Breaking down France’s new $76M Africa startup fund

08:30 | 18 June

Jake Bright Contributor
Jake Bright is a writer and author in New York City. He is co-author of The Next Africa.

Weeks after French President Emmanuel Macron unveiled a $76M African startup fund at VivaTech 2018, TechCrunch paid a visit to the French Development Agency (AFD) — which will administer the new fund — to get more details on how it will work.

The $76M (or €65M) will be divvied up into three parts, according to AFD Digital Task Team Leader Christine Ha.

“There are €10M [$11.7M] for technical assistance to support the African ecosystem… €5M will be available as interest-free loans to high-potential, pre-seed startups…and…€50M [$58M] will be for equity-based investments in series A to C startups,” explained Ha during a meeting in Paris.

The technical assistance will be distributed in the form of grants to accelerators, hubs, incubators, and coding programs. The pre-seed startup loans will be issued in amounts of up to $100K “as early, early funding to allow entrepreneurs to prototype, launch, and experiment,” said Ha.

The $58M in VC startup funding will be administered through Proparco, a development finance institution — or DFI — partially owned by the AFD. The money will come “from Proparco’s balance sheet”…and a portion “will be invested in VC funds active on the continent,” said Ha.

Proparco already invests in Africa-focused funds such as TLcom Capital and Partech Ventures. “Proparco will take equity stakes, and will be a limited partner when investing in VC funds,” said Ha.

Startups from all African countries can apply for a piece of the $58M by contacting any of Proparco’s Africa offices (including in Casablanca, Abidjan, Douala, Lagos, Nairobi, Johannesburg).

And what will AFD (and Proparco) look for in African startup candidates? “We are targeting young and innovative companies able to solve problems in terms of job creation, access to financial services, energy, health, education and affordable goods and services…[and] able to scale up their venture on the continent,” said Ha.

The $11.7M technical assistance and $5.8M loan portions of France’s new fund will be available starting in 2019. On implementation, AFD is still “reviewing several options…such as relying on local actors through [France’s] Digital Africa platform,” said Ha.

Digital Africa — a broader French government initiative to support the African tech ecosystem — will launch a new online platform in November 2018 with resources for startup entrepreneurs.

So that’s the skinny on France’s new Africa fund. It adds to a load of VC funding announced for the continent in less than 15 months, including $70M for Partech Ventures, TPG Growth’s $2BN Rise Fund, and $40M at TLcom Capital.

Though $76M (and these other amounts) may pale compared to Silicon Valley VC values, it’s a lot for a startup scene that — at a rough estimate — attracted only $400M four years ago. African tech entrepreneurs, you now have a lot more global funding options, including from France.


Blockchain technology could be the great equalizer for American cities

01:30 | 18 June

The city of Austin is currently piloting a program in which its 2,000 homeless residents will be given a unique identifier that’s safely and securely recorded on the blockchain. This identifier will help individuals consolidate their records and seek out crucial services. Service providers will also be able to access the information. If successful, we’ll have a new, more efficient way to communicate and ensure that the right people are at the table to help the homeless.

In Austin and around the country, it seems that blockchain technology is opening a range of opportunities for city service delivery and operations.

At its core, blockchain is a secure, inalterable electronic register. Serving as a shared database or distributed ledger, it lives permanently online and can record anything that can be represented digitally, such as rights, goods and property. Through enhanced trust, consensus and autonomy, blockchain brings widespread decentralization to transactions.
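The “inalterable register” idea can be shown in a few lines. Here is a toy sketch in Python (the record contents and function names are invented for illustration; a real blockchain adds consensus across many machines): each entry stores the hash of the previous one, so editing any past record invalidates everything after it.

```python
import hashlib
import json

def add_record(chain, record):
    """Append a record, chaining it to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode())
    chain.append({**body, "hash": digest.hexdigest()})

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        body = {"record": entry["record"], "prev_hash": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode())
        if entry["hash"] != digest.hexdigest() or entry["prev_hash"] != prev_hash:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
add_record(ledger, "resident-42: service ID issued")
add_record(ledger, "resident-42: clinic visit recorded")
assert verify(ledger)
ledger[0]["record"] = "tampered"
assert not verify(ledger)  # one edit invalidates the whole register
```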

At the municipal level, blockchain has the potential to create countless smart networks and grids, altering how we do everything from voting and building credit to receiving energy. In many ways, it could be a crucial component of what is needed to circumvent outdated systems and build long-lasting solutions for cities.

So far, as Motherboard has previously reported, blockchain has largely been a “rich getting richer” situation. But if it’s good enough for the wealthy, why can’t it also help the poorer, more vulnerable members of the population?

Consider, for a moment, that it might be a major player in the more inclusive future we’ve always wanted.

Arguably, we have a lot of work to do. According to new research, 43 percent of families struggle to afford basics like food and housing. These populations are perhaps the ones who stand to gain the most from blockchain, the Internet of Things (IoT) and the advent of smart cities — if done right.

Smart city technology is growing ever more common here in the US and around the world. Our research shows that 66 percent of cities have invested in some sort of smart city technological infrastructure that enables them to collect, aggregate and analyze real-time data, and many of these efforts are already showing great promise in improving the lives of residents.

Take, for instance, electricity. With the help of blockchain, we can turn microgrids into a reality on a macro scale, enabling communities to more easily embrace solar power and other more sustainable sources, which in turn will result in fewer emissions and lower healthcare costs and rates of disease. But in the more immediate future, blockchain-enabled microgrids would allow consumers to join a power “exchange” in which they can sell their surplus energy. In many scenarios, the consumers’ bills would either significantly drop, or they’d earn money.

Then there’s the question of building credit. It should be no surprise that the poor are the most likely to have debt and unpaid bills and, therefore, bad credit. They are also the most likely to be “unbanked,” meaning they don’t use banks at all. In fact, seven percent of Americans don’t use banks. But with blockchain, we can design an alternative way to build credit and track transactions.

And, of course, there is voting — an issue that, more than ever, is vital to a thriving democracy. The US has lower voter turnout than just about every other developed country. In fact, just over half of voting-age Americans voted in 2016. We don’t talk enough about how important civic engagement — and holding politicians accountable — is for making the playing field fairer. We do, however, talk about what it would be like to be able to email our votes from the comfort of our home computer or smartphone. While email isn’t nearly secure enough for selecting our leaders, being able to vote from home is something we could — and should — aim to do.

Blockchain is proving to be a secure enough system to make this a reality. The result could be more young people, communities of color and disabled voters “showing up” to the polls. These online polls would be more “hack-proof” — another contemporary concern — and votes could be counted in real time. Imagine never again going to bed thinking one candidate had won a race, only to wake up and find it was actually someone else.

Where will we go next with blockchain, and what can this powerful new tool do for cities? Our latest National League of Cities report, Blockchain in Cities, provides mayors and other local officials with some clues. The research not only explores how cities can use blockchain now, but also how it will be used in the future to enable technology like autonomous vehicles that can “talk” to each other. These types of use cases — plus existing opportunities from blockchain — could potentially be transformative for municipal operations.

Blockchain is far more than just cryptocurrency. In time, blockchain could turn American society on its head, and at the same time make our major institutions, and the places we live, more inclusive. Cities — and in some cases states — are the places where this will be piloted. By developing smarter cities and utilizing blockchain as a secure resource, city leaders can provide community members with the tools they need for success.


VCs serve up a large helping of cash to startups disrupting food

21:11 | 16 June

Here is what your daily menu might look like if recently funded startups have their way.

You’ll start the day with a nice, lightly caffeinated cup of cheese tea. Chase away your hangover with a cold bottle of liver-boosting supplement. Then slice up a few strawberries, fresh-picked from the corner shipping container.

Lunch is full of options. Perhaps a tuna sandwich made with a plant-based, tuna-free fish. Or, if you’re feeling more carnivorous, grab a grilled chicken breast fresh from the lab that cultured its cells, while crunching on a side of mushroom chips. And for extra protein, how about a brownie?

Dinner might be a pizza so good you send your compliments to the chef — only to discover the chef is a robot. For dessert, have some gummy bears. They’re high in fiber with almost no sugar.

Sound terrifying? Tasty? Intriguing? If you checked tasty and intriguing, then here is some good news: The concoctions highlighted above are all products available (or under development) at food and beverage startups that have raised venture and seed funding this past year.

These aren’t small servings of capital, either. A Crunchbase News analysis of venture funding for the food and beverage category found that startups in the space gobbled up more than $3 billion globally in disclosed investment over the past 12 months. That includes a broad mix of supersize deals, tiny seed rounds and everything in-between.

Spending several hours looking at all these funding rounds leaves one with a distinct sense that eating habits are undergoing a great deal of flux. And while we can’t predict what the menu of the future will really hold, we can highlight some of the trends. For this initial installment in our two-part series, we’ll start with foods. Next week, we’ll zero in on beverages.

Chickenless nuggets and fishless tuna

For protein lovers disenchanted with commercial livestock farming, the future looks good. At least eight startups developing plant-based and alternative proteins closed rounds in the past year, focused on everything from lab meat to fishless fish to fast-food nuggets.

New investments add momentum to what was already a pretty hot space. To date, more than $600 million in known funding has gone to what we’ve dubbed the “alt-meat” sector, according to Crunchbase data. Actual investment levels may be quite a bit higher since strategic investors don’t always reveal round size.

In recent months, we’ve seen particularly strong interest in the lab-grown meat space. At least three startups in this area — Memphis Meats, SuperMeat and Wild Type — raised multimillion-dollar rounds this year. That could be a signal that investors have grown comfortable with the concept, and that it’s now more a matter of who will be early to market with a tasty and affordable finished product.

Makers of meatless versions of common meat dishes are also attracting capital. Two of the top funding recipients in our data set include Seattle Food Tech, which is working to cost-effectively mass-produce meatless chicken nuggets, and Good Catch, which wants to hook consumers on fishless seafoods. While we haven’t sampled their wares, it does seem like they have chosen some suitable dishes to riff on. After all, in terms of taste, both chicken nuggets and tuna salad are somewhat removed from their original animal protein sources, making it seemingly easier to sneak in a veggie substitute.

Robot chefs

Another trend we saw catching on with investors is robot chefs. Modern cooking is already a gadget-driven process, so it’s not surprising investors see this as an area ripe for broad adoption.

Pizza, the perennial takeout favorite, seems to be a popular area for future takeover by robots, with at least two companies securing rounds in recent months. Silicon Valley-based Zume, which raised $48 million last year, uses robots for tasks like spreading sauce and moving pies in and out of the oven. France’s EKIM, meanwhile, recently opened what it describes as a fully autonomous restaurant staffed by pizza robots cooking as customers watch.

Salad, pizza’s healthier companion side dish, is also getting roboticized. Just this week, Chowbotics, a developer of robots for food service whose lineup includes Sally the salad robot, announced an $11 million Series A round.

Those aren’t the only players. We’ve put together a more complete list of recently launched or funded robot food startups here.

Beyond sugar

Sugar substitutes aren’t exactly a new area of innovation. Diet Rite, often credited as the original diet soda, hit the market in 1958. Since then, we’ve had 60 years of mass-marketing for low-calorie sweeteners, from aspartame to stevia.

It’s not over. In recent quarters, we’ve seen a raft of funding rounds for startups developing new ways to reduce or eliminate sugar in many of the foods we’ve come to love. On the dessert and candy front, Siren Snacks and SmartSweets are looking to turn favorite indulgences like brownies and gummy bears into healthy snack options.

The quest for good-for-you sugar also continues. The latest funding recipient in this space appears to be Bonumuse, which is working to commercialize two rare sugars, tagatose and allulose, as lower-calorie and potentially healthier substitutes for table sugar. We’ve compiled a list of more sugar-reduction-related startups here.

Where is it all headed?

It’s tough to tell which early-stage food startups will take off and which will wind up in the scrap bin. But looking in aggregate at what they’re cooking up, it looks like the meal of the future will be high in protein, low in sugar and prepared by a robot.

 



The problem with ‘explainable AI’

00:00 | 15 June

The first consideration when discussing transparency in AI should be data, the fuel that powers the algorithms. Companies should disclose where and how they got the data they used to fuel their AI systems’ decisions. Consumers should own their data and should be privy to the myriad ways that businesses use and sell such information, which is often done without clear and conscious consumer consent. Because data is the foundation for all AI, it is valid to want to know where the data comes from and how it might explain biases and counterintuitive decisions that AI systems make.

On the algorithmic side, grandstanding by IBM and other tech giants around the idea of “explainable AI” is nothing but virtue signaling that has no basis in reality. I am not aware, for instance, of any place where IBM has laid bare the inner workings of Watson — how do those algorithms work? Why do they make the recommendations/predictions they do?

There are two issues with the idea of explainable AI. The first is one of definition: What do we mean by explainability? What do we want to know? The algorithms or statistical models used? How learning has changed parameters over time? What a model looked like for a certain prediction? A cause-and-effect relationship expressed in human-intelligible concepts?

Each of these entails a different level of complexity. Some are pretty easy — someone had to design the algorithms and data models, so they know what they used and why. What these models are is also pretty transparent. In fact, one of the refreshing facets of the current AI wave is that most of the advancements are published in peer-reviewed papers — open and available to everyone.

What these models mean, however, is a different story. How they change and how they behave for a specific prediction can be checked, but what they mean is unintelligible to most of us. It would be like buying an iPad with a label on the back explaining how a microprocessor and touchscreen work — good luck! And adding the layer of human-intelligible causal relationships is a different problem altogether.

Part of the advantage of some current approaches (most notably deep learning) is that the model identifies (some) relevant variables that are better than the ones we can define ourselves. That very complexity is part of why these models perform better, and it is hard to explain precisely because the system finds variables and relationships that humans have not identified or articulated. If we could, we would program them directly and call the result software.
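
To make the distinction concrete, here is a minimal sketch (in Python with scikit-learn; the synthetic data and small architecture are arbitrary choices for illustration, not anyone’s production system). Every learned parameter of the toy network below is open to inspection — “transparent” in the literal sense — yet none of those numbers explains a prediction in human-intelligible terms.

```python
# Illustrative sketch only: a toy network whose parameters are fully
# inspectable but not human-intelligible. Data and architecture are
# arbitrary assumptions for the demo.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                      random_state=0).fit(X, y)

# Full "transparency": every learned weight and bias can be enumerated...
n_params = sum(w.size for w in model.coefs_) + sum(b.size for b in model.intercepts_)
for i, w in enumerate(model.coefs_):
    print(f"layer {i}: weight matrix of shape {w.shape}")
print(f"{n_params} parameters, all available for inspection")

# ...but the prediction itself comes with no human-readable "why".
print("prediction for the first sample:", model.predict(X[:1])[0])
```

Open parameters are not the same thing as an explanation — which is exactly the definitional gap described above.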

The second overarching factor when considering explainable AI is assessing the trade-offs of “true explainable and transparent AI.” Currently, in some tasks, there is a trade-off between performance and explainability — and there are business ramifications as well. If all the inner workings of an AI-powered platform were publicly available, then intellectual property as a differentiator would be gone.

Imagine if a startup created a proprietary AI system, for instance, and was compelled to explain exactly how it worked, to the point of laying it all out — it would be akin to asking a company to disclose its source code. If the IP had any value, the company would be finished soon after it hit “send.” That’s why, generally, a push for those requirements favors incumbents that have big budgets and market dominance, and would stifle innovation in the startup ecosystem.

Please don’t misread this to mean that I’m in favor of “black box” AI. Companies should be transparent about their data and offer an explanation of their AI systems to those who are interested, but we need to think through the societal implications of that transparency, both in terms of what we can do and in terms of the business environment we create. I am all for open source and transparency, and I see AI as a transformative technology with a positive impact. By putting such a premium on transparency, we are setting a very high burden for what amounts to an infant but high-potential industry.

 



Beware ‘founder-friendly’ VCs — 3 steps founders should take to protect their companies

23:00 | 12 June

In 2014, it seemed like pretty much anyone with a pulse and pitch deck was capable of raising huge amounts of capital from prestigious venture capital firms at sky-high valuations. Here we are four years later and times have changed. VCs inked a little more than 3,100 deals in the last quarter of 2017, according to Crunchbase — about 500 fewer than the previous quarter.

For aspiring startup founders, it’s a “confusing time in the so-called Unicorn story,” as Erin Griffith put it in a column last May — an asset bubble that never really popped, but which at the very least is deflating. In the confirmation hearing for new SEC Chairman Jay Clayton, lawmakers lamented the dearth of initial public offerings as companies that thrived in private markets — from Snap to Blue Apron — have struggled to deliver meaningful returns to investors.

This all creates a number of dilemmas for founders looking to raise capital and scale businesses in 2018. VCs remain an integral part of the innovation ecosystem. But what happens when the changing dynamics of financial markets collide with VCs’ expectations regarding growth? VCs may not always be aligned with founders and companies in this new environment. A recent study commissioned by Eric Paley at Founder Collective found that by pressuring companies to scale prematurely, venture capitalists are indirectly responsible for more startup deaths than founder infighting, technical debt and slow customer adoption — combined.

The new landscape requires that founders be especially judicious about how they seek out new sources of capital, how they structure cap tables and ownership, and what concessions they make to their new backers in exchange for that much-needed cash. Here are three ways founders can ensure they’re looking out for what’s best for their companies — and themselves — in the long run.

Take time to backchannel

Venture capitalists are arguably in the business of due diligence. Before they sign on the dotted line, they can be expected to call your competitors, your customers, your former employers, your business school classmates — they will ask everyone and their mother about you.

A first-time founder is also new to the pressures of entrepreneurship, of having employees rely on you for their livelihoods. Whether they are desperate for cash to make payroll or anxious for the validation of a headline-worthy investment, few founders take the time to properly backchannel their investors. Until you can say you’ve done due diligence of your own, your opinion of your VCs is going to be based on the size of their fund, the deals they’ve done or the press they’ve gotten. In short, it will likely be based on what they’ve done right.

On the other hand, you likely don’t know anything about the actual partner who will join your board. Do they know your space? Do they have a meaningful network, or do they just know a few headhunters? Are they value creators? What is their political standing within their firm? Before you sign a term sheet, take the time to contextualize the profile of the person who is taking a board seat. It gives you foresight into the actions your investment partner is likely to take down the road.

Think beyond your first raise

If you do decide to raise capital, make sure you are in alignment with your board regarding your business plan, the pursuit of profit at the expense of revenue growth, or vice versa, and how it will steer your decision making as the market changes. It goes without saying that differences of opinion regarding your business strategy can lead to big conflict down the road.

As you think about these trade-offs, remember that as an entrepreneur, your obligation is to the existing shareholders: the employees and you. As the pack of potential unicorns has thinned, VCs in particular have turned to unconventional deal structures, like the use of common and preferred shares. For the founder who needs to raise cash, a dual ownership structure seems like a fair compromise to make, but remember that it may be at the expense of your employees’ option pool. The interests of preferred and common shareholders are not perfectly aligned, particularly when it comes time to make difficult decisions in the future.

Is VC money right for you?

VCs frequently share information, board decks and investor presentations with members of the press and the tech community, sometimes in support of their own personal agendas or to get perspective on whether to invest or not. That’s why it’s particularly important to backchannel, and more importantly, that you have allies that you can call on and people who can ensure some measure of goodwill. A good company board cannot be made up of just the investors and you: You need advocates that are balanced and on your side.

These prescriptions can sound paranoid, particularly to the founder whose business is growing nicely. But anything can cause a sea change and put you at odds with the people funding your company — who now own a piece of the company that you’re trying to build. When disagreements arise, it can get tense. They might say that you are a first-time founder, and therefore a novice. They will make your weaknesses known and say you’ll never be able to raise again if you ignore their invaluable advice. It’s important that you don’t fall into the fear trap. If you create a product or service that solves an undeniable problem, the money will come — and you will get funded again.

The term “founder-friendly” VC was perhaps always a bit of a misnomer. The people building the business and the people planning on cashing in on your efforts are imperfect allies. As a founder and business owner, your primary responsibilities are to your clients, to the company you’re building and, most importantly, to the employees who are helping you do it. As founders, we like to think that we have all the answers, especially in bad times. Making sure you are aligned with your investors in challenging and unpredictable situations is critical. It’s important to anticipate how your investors will problem-solve before you give up control.

Venture capital is far from the only way to finance an early-stage business. Founders looking to jump-start their business have a number of alternatives, from debt financing and bootstrapping to crowdfunding, angel investors and ICOs. There are indeed still many advantages to having experienced investors on your side, not simply the cash but also the access to hiring and industry knowledge. But the relationship can only benefit both parties when founders go in eyes wide open.

 



Africa Roundup: African startup investments turn to fintech this winter season

09:30 | 11 June

Forty-seven and a half million dollars is a big commitment to African technology companies — even with the recent uptick in VC investment on the continent.

But for the Kenyan-based fintech firm Cellulant, whose digital payments platform processed 7 million transactions worth $350 million across 33 African countries in the last month alone, raising that amount in a series C round led by TPG Growth’s Rise Fund just makes sense.

In 2017, the company processed $2.7 billion in payments, said chief executive Ken Njoroge.

Clients include the continent’s largest banks: Barclays Bank, Standard Chartered, Standard Bank, and Ecobank. Cellulant also has multiple revenue streams and is EBITDA positive, according to its CEO.

So what does an African technology company do with $47.5 million? “The round is to accelerate our growth of around 20 percent…north of 50 percent,” said Njoroge. “Most of the investment is to scale out our existing platform in Africa and build usage on our existing network.”

Founded in 2004, Cellulant offers person-to-business (P2B) and B2B payment services through its Mula and Tingg products. It’s also developing Agrikore, a blockchain-based product for agriculture-related market activity.

On Africa’s digital payments potential, “We’ve built internal value models that estimate the size of the market at somewhere between $25BN and $40BN,” said Njoroge.

He differentiates Cellulant’s focus from Safaricom’s M-Pesa — one of Africa’s most recognized payment products — by transaction type and scope. “Kenya’s M-Pesa is optimized as a P2P platform in a few African countries. We’re optimized as a P2B platform and single pipe into multiple countries across Africa,” he said.

One of those countries is economic and population powerhouse Nigeria — where Cellulant offers both its Tingg and Agrikore apps. Nigeria is also home to the notable digital payment companies Paga and Interswitch, the latter of which has expanded across Africa and is considered a candidate for a public offering.

On a future Cellulant initial public offering, “it’s too early,” said Njoroge. But he doesn’t rule it out. “When you look at the size of the payments business, you could say we have fairly strong prospects to go in that direction.”

Meanwhile, the Nigerian investment startup Piggybank.ng closed $1.1 million in seed funding and announced a new product, Smart Target, which offers a more secure and higher-return option for the Esusu or Ajo group savings clubs common across West Africa.

The financing was led by a $1 million commitment from LeadPath Nigeria, with Village Capital and Ventures Platform joining the round.

Founded in 2016, Piggybank.ng offers online savings plans — primarily to low- and middle-income Nigerians — for deposits of small amounts on a daily, weekly, monthly or annual basis. There are no upfront fees.

Savers earn interest of between 6 and 10 percent, depending on the type and duration of the plan, Piggybank.ng’s Somto Ifezue explained in this TechCrunch exclusive.

The startup generates returns for small-scale savers (primarily) through investment in Nigerian government securities, such as bonds and treasury bills.
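
To put those rates in perspective, here is a back-of-the-envelope sketch (in Python). The deposit amount, weekly pro-rata compounding and one-year horizon are assumptions chosen for illustration, not Piggybank.ng’s actual product terms.

```python
# Hypothetical sketch: year-end balance for a small weekly savings plan
# at the 6 and 10 percent annual rates quoted above. The deposit amount,
# compounding and schedule are assumptions, not actual product terms.
def year_end_balance(weekly_deposit: float, annual_rate: float, weeks: int = 52) -> float:
    weekly_rate = annual_rate / weeks  # simple pro-rata weekly rate
    balance = 0.0
    for _ in range(weeks):
        balance = (balance + weekly_deposit) * (1 + weekly_rate)
    return balance

for rate in (0.06, 0.10):
    total = year_end_balance(weekly_deposit=1000.0, annual_rate=rate)
    print(f"{rate:.0%} annual rate: 52 deposits of NGN 1,000 grow to about NGN {total:,.0f}")
```

Even on small, regular deposits, the gap between the low and high ends of that range is visible by year-end — a concrete illustration of the higher-return pitch relative to the informal savings clubs mentioned above.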

Piggybank.ng generates revenue through asset management and from the float its balances generate at partner banks.

The Lagos-based startup will use its $1.1 million in new seed funding for “license acquisition and product development,” according to company COO Odunayo Eweniyi.

Piggybank.ng is looking to grow its client base among younger Nigerians and the country’s informal savings groups, and it has taken preliminary steps toward launching in other African countries.

Lead investor and LeadPath Nigeria founder Olumide Soyombo also sees Piggybank.ng as a potential acquisition target for banks.

“The banks have been slow to try new things in this savings space. Piggybank is coming in…and filling a particular need, so they are in a very acquisitive space.”
