Blog of the website «TechCrunch»


Main article: Cloud computing

Topics from 1 to 10 | in all: 530

AWS partners with Kenya’s Safaricom on cloud and consulting services

08:25 | 28 February

Amazon Web Services has entered a partnership with Safaricom — Kenya’s largest telco, ISP and mobile payment provider — in a collaboration that could spell competition between American cloud providers in Africa.

In a statement to TechCrunch, the East African company framed the arrangement as a “strategic agreement” whereby Safaricom will sell AWS services (primarily cloud) to its East Africa customer network.

Safaricom — whose products include the famed M-Pesa mobile money product — will also become the first Advanced Consulting Partner for the AWS partner network in East Africa.

“The APN is…the program for technology…businesses who leverage AWS to build solutions and services for customers…and sell their AWS offerings by providing valuable business, technical, and marketing support,” Safaricom said.

“We chose to partner with AWS because it offers customers the broadest and deepest cloud platform…This agreement will allow us to accelerate our efforts to enable digital transformation in Kenya,” said Safaricom CEO Michael Joseph.

“Safaricom will be able to offer AWS services to East-African customers, allowing businesses of all sizes to quickly get started on AWS cloud,” the company statement continued.

For now, the information provided by Safaricom is a bit sparse on the why and how of the partnership between the American company and the East African mobile, financial and ISP provider.

TechCrunch has an inquiry in with Amazon and has posed additional questions to Safaricom ahead of additional coverage.

An initial what-this-all-means take on the partnership points to an emerging competition between American cloud service providers to scale in Africa by leveraging networks of local partners.

The most obvious rival to the AWS-Safaricom strategic agreement is the Microsoft-Liquid Telecom collaboration. Since 2017, MS has partnered with the Southern African digital infrastructure company to grow Microsoft’s AWS competitor product — Azure — and offer cloud services to the continent’s startups and established businesses.

MS and Liquid Telecom have focused heavily on the continent’s young tech companies. “We believe startups will be key employers in Africa’s future economy. They’re also our future customers,” Liquid Telecom’s Head of Innovation Partnerships Oswald Jumira told TechCrunch in 2018.

Amazon hasn’t gone fully live yet with e-commerce services in Africa, but it has aggressively positioned AWS and built a regional client list that includes startups — such as fintech venture Jumo — and large organizations, such as Absa and Standard Bank.

Partnering with Safaricom plugs AWS into the network of one of East Africa’s most prominent digital companies.

Safaricom, driven primarily by its M-Pesa mobile money product, holds remarkable dominance in Kenya, Africa’s sixth-largest economy. M-Pesa has 20.5 million customers across a network of 176,000 agents and generates around one-fourth ($531 million) of Safaricom’s roughly $2.2 billion in annual revenue (2018).
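As a quick sanity check on the revenue figures cited above ($531 million against roughly $2.2 billion, per Safaricom's 2018 reporting), the "one-fourth" characterization holds up:

```python
# Sanity check of the cited 2018 figures: M-Pesa's $531M against
# Safaricom's roughly $2.2B in total annual revenue.
m_pesa_revenue = 531_000_000
total_revenue = 2_200_000_000
share = m_pesa_revenue / total_revenue
print(f"M-Pesa share of Safaricom revenue: {share:.0%}")  # 24%, i.e. about one-fourth
```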

Compared to other players — such as Airtel Money and Equitel Money — M-Pesa holds 80% of Kenya’s mobile money agent network, 82% of the country’s active mobile-money subscribers and transfers 80% of Kenya’s mobile-money transactions, per the latest sector statistics.

A number of Safaricom’s clients (including those it provides payments and internet services to) are companies, SMEs and startups.

Extending AWS services to them will play out next to the building of Microsoft’s $100 million Africa Development Center, with an office in Nairobi, announced last year.

 



VC firm Oxx says SaaS startups should avoid high-risk growth models

02:47 | 25 February

Oxx, a European venture capital firm co-founded by Richard Anton and Mikael Johnsson, this month announced the closing of its debut fund of $133 million to back “Europe’s most promising SaaS companies” at Series A and beyond.

Launched in 2017 and headquartered in London and Stockholm, Oxx pitches itself as one of only a few European funds focused solely on SaaS, and says it will invest broadly across software applications and infrastructure, highlighting five key themes: “data convergence & refinery,” “future of work,” “financial services infrastructure,” “user empowerment” and “sustainable business.”

However, its standout USP is that the firm says it wants to be a more patient form of capital than investors who have a rigid Silicon Valley SaaS mindset, which, it says, often places growth ahead of building long-lasting businesses.

I caught up with Oxx’s co-founders to dig deeper into their thinking, both with regards to the firm’s remit and investment thesis, and to learn more about the pair’s criticism of the prevailing venture capital model they say often pushes SaaS companies to prioritize “grow at all costs.”

TechCrunch: Oxx is described as a B2B software investor investing in SaaS companies across Europe from Series A and beyond. Can you be more specific regarding the size of check you write and the types of companies, geographies, technologies and business models you are focusing on?

Richard Anton: We will lead funding rounds anywhere in the range $5-20 million in SaaS companies. Some themes we’re especially excited about include data convergence and the refining and usage of data (think applications of machine learning, for example), the future of work, financial services infrastructure, end-user empowerment and sustainable business.

 



Mirantis co-founder launches FreedomFi to bring private LTE networks to enterprises

02:22 | 25 February

Boris Renski, the co-founder of Mirantis, one of the earliest and best-funded players in the OpenStack space a few years ago (which then mostly pivoted to Kubernetes and DevOps), has left his role as CMO to focus his efforts on a new startup: FreedomFi. The new company brings together open-source hardware and software to give enterprises a new way to leverage the newly opened 3.5 GHz band for private LTE and — later — 5G IoT deployments.

“There is a very broad opportunity for any enterprise building IoT solutions, which completely changes the dynamic of the whole market,” Renski told me when I asked him why he was leaving Mirantis. “This makes the whole space very interesting and fast-evolving. I felt that my background in open source and my existing understanding of the open-source landscape and the LTE space […] is an extremely compelling opportunity to dive into headfirst.”

Renski told me that a lot of the work the company is doing is still in its early stages, but the company recently hit a milestone when it used its prototype stack to send messages across its private network over a distance of around 2.7 miles.

Mirantis itself worked on bringing Magma, a Facebook-developed open-source tool for powering some of the features needed for building access networks, into production. FreedomFi is also working with the OpenAirInterface consortium, which aims to create an ecosystem for open-source software and hardware development around wireless innovation. Most, if not all, of the technology the company develops over time will also be open source.

Renski, of course, gets to leverage his existing connections in the enterprise and telco industry with this new venture, but he also told me that he plans to leverage the Mirantis playbook as he builds out the company.

“At Mirantis, our journey was that we started with basically offering end-to-end open-source cloud buildouts to a variety of enterprises back when OpenStack was essentially the only open-source cloud project out there,” he explained. “And we spent a whole bunch of time doing that, engaging with customers, getting customer revenue, learning where the bottlenecks are — and then kind of gradually evolving into more of a leveraged business model with a subscription offering around OpenStack and then MCP and now Kubernetes, Docker, etc. But the key was to be very kind of customer-centric, go get some customer wins first, give customers a services-centric offering that gets them to the result, and then figure out where the leveraged business model opportunities are.”

Currently, enterprises that want to attempt to build their own private LTE networks — and are willing to spend millions on it — have to go to the large telecom providers. Those companies, though, aren’t necessarily interested in working on these relatively small deployments (or at least “small” by the standards of a telco).

Renski and his team started the project about two months ago and for now, it remains self-funded. But the company already has five pilots lined up, including one with a company that produces large-scale events and another with a large real estate owner, and with some of the tech falling in place, Renski seems optimistic that this is a project worth focusing on. There are still some hurdles to overcome and Renski tells me the team is learning new things every day. The hardware, for example, remains hard to source and the software stack remains in flux. “We’re probably at least six months away from having solved all of the technology and business-related problems pertaining to delivering this kind of end-to-end private LTE network,” he said.

 



Databricks makes bringing data into its ‘lakehouse’ easier

17:00 | 24 February

Databricks today announced the launch of its new Data Ingestion Network of partners and of its Databricks Ingest service. The idea here is to make it easier for businesses to combine the best of data warehouses and data lakes into a single platform — a concept Databricks likes to call ‘lakehouse.’

At the core of the company’s lakehouse is Delta Lake, Databricks’ Linux Foundation-managed open-source project that brings a new storage layer to data lakes that helps users manage the lifecycle of their data and ensures data quality through schema enforcement, log records and more. Databricks users can now work with the first five partners in the Ingestion Network — Fivetran, Qlik, Infoworks, StreamSets, Syncsort — to automatically load their data into Delta Lake. To ingest data from these partners, Databricks customers don’t have to set up any triggers or schedules — instead, data automatically flows into Delta Lake.
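The schema enforcement mentioned above is the core data-quality guarantee of this storage layer: writes that don't match a table's declared schema are rejected rather than silently appended. The following is a minimal plain-Python sketch of that idea (not the actual Delta Lake API; the table schema and field names are hypothetical):

```python
# Minimal sketch of schema enforcement on ingest: records that deviate
# from the declared schema are rejected instead of silently written.
# This is an illustration of the concept, not the Delta Lake API.

EXPECTED_SCHEMA = {"user_id": int, "event": str, "amount": float}  # hypothetical schema

def enforce_schema(record: dict) -> dict:
    """Raise if the record's fields or types deviate from the declared schema."""
    if set(record) != set(EXPECTED_SCHEMA):
        raise ValueError(f"schema mismatch: got fields {sorted(record)}")
    for field, expected_type in EXPECTED_SCHEMA.items():
        if not isinstance(record[field], expected_type):
            raise ValueError(f"field {field!r} is not {expected_type.__name__}")
    return record

def ingest(records):
    """Append only schema-conforming records; collect rejects for review."""
    table, rejects = [], []
    for r in records:
        try:
            table.append(enforce_schema(r))
        except ValueError:
            rejects.append(r)
    return table, rejects

good = {"user_id": 1, "event": "stream", "amount": 0.004}
bad = {"user_id": "1", "event": "stream"}  # wrong type, missing field
table, rejects = ingest([good, bad])
print(len(table), len(rejects))  # 1 1
```

The design choice — fail the bad write rather than coerce it — is what keeps downstream BI and ML jobs from silently consuming corrupted data.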

“Until now, companies have been forced to split up their data into traditional structured data and big data, and use them separately for BI and ML use cases. This results in siloed data in data lakes and data warehouses, slow processing and partial results that are too delayed or too incomplete to be effectively utilized,” says Ali Ghodsi, co-founder and CEO of Databricks. “This is one of the many drivers behind the shift to a Lakehouse paradigm, which aspires to combine the reliability of data warehouses with the scale of data lakes to support every kind of use case. In order for this architecture to work well, it needs to be easy for every type of data to be pulled in. Databricks Ingest is an important step in making that possible.”

Databricks VP of Product Marketing Bharath Gowda also tells me that this will make it easier for businesses to perform analytics on their most recent data and hence be more responsive when new information comes in. He also noted that users will be able to better leverage their structured and unstructured data for building better machine learning models, as well as to perform more traditional analytics on all of their data instead of just a small slice that’s available in their data warehouse.

 

 



Rallyhood exposed a decade of users’ private data

02:00 | 24 February

Rallyhood says it’s “private and secure.” But for some time, it wasn’t.

The social network, designed to help groups communicate and coordinate, left one of its cloud storage buckets open and exposed. The bucket, hosted on Amazon Web Services (AWS), was not protected with a password, allowing anyone who knew the easily guessable web address access to a decade’s worth of user files.

Rallyhood boasts users from Girl Scout and Boy Scout troops, and Komen, Habitat for Humanity and YMCA chapters. The company also hosts thousands of smaller groups, like local bands, sports teams, art clubs, and organizing committees. Many flocked to the site after Rallyhood said it would help migrate users from Yahoo Groups, after Verizon (which also owns TechCrunch) said it would shut down the discussion forum site last year.

The bucket contained group data dating as far back as 2011 and up to and including last month. In total, the bucket contained 4.1 terabytes of uploaded files, representing millions of users’ files.

Some of the files we reviewed contained sensitive data, like shared password lists and contracts or other permission slips and agreements. The documents also included non-disclosure agreements and other files that were not intended to be public.

Where we could identify contact information of users whose information was exposed, TechCrunch reached out to verify the authenticity of the data.

A security researcher who goes by the handle Timeless found the exposed bucket and informed TechCrunch, so that the bucket and its files could be secured.

When reached, Rallyhood chief technology officer Chris Alderson initially claimed that the bucket was for “testing” and that all user data was stored “in a highly secured bucket,” but later admitted that during a migration project, “there was a brief period when permissions were mistakenly left open.”

It’s not known if Rallyhood plans to warn its users and customers of the security lapse. At the time of writing, Rallyhood had made no statement about the incident on its website or any of its social media profiles.

 



A group of ex-NSA and Amazon engineers are building a ‘GitHub for data’

16:00 | 20 February

Six months ago or thereabouts, a group of engineers and developers with backgrounds from the National Security Agency, Google and Amazon Web Services had an idea.

Data is valuable for helping developers and engineers build new features and better innovate. But that data is often highly sensitive and out of reach, kept under lock and key by red tape and compliance processes that can take weeks to clear. So the engineers started Gretel, an early-stage startup that aims to help developers safely share and collaborate on sensitive data in real time.

It’s not as niche a problem as you might think, said Alex Watson, one of the co-founders. Developers can face this problem at any company, he said. Often, developers don’t need full access to a bank of user data — they just need a portion or a sample to work with. In many cases, developers could make do with data that merely looks like real user data.

“It starts with making data safe to share,” Watson said. “There’s all these really cool use cases that people have been able to do with data.” He said companies like GitHub, a widely used source code sharing platform, helped to make source code accessible and collaboration easy. “But there’s no GitHub equivalent for data,” he said.

And that’s how Watson and his co-founders, John Myers, Ali Golshan and Laszlo Bock came up with Gretel.

“We’re building right now software that enables developers to automatically check out an anonymized version of the data set,” said Watson. This so-called “synthetic data” is essentially artificial data that looks and works just like regular sensitive user data. Gretel uses machine learning to categorize the data — like names, addresses and other customer identifiers — and to apply as many labels to the data as possible. Once the data is labeled, access policies can be applied to it. Then, the platform applies differential privacy — a technique used to anonymize vast amounts of data — so that it’s no longer tied to customer information. “It’s an entirely fake data set that was generated by machine learning,” said Watson.
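The two mechanisms described — labeling identifier fields so they can be stripped, and differential privacy for released aggregates — can be sketched in a few lines. This is an illustration of the general techniques, not Gretel's implementation; the label set and field names are hypothetical:

```python
import random

# Illustrative sketch, not Gretel's code: (1) drop fields labeled as
# customer identifiers, and (2) release a numeric aggregate with Laplace
# noise -- the standard differential-privacy mechanism for count queries.

IDENTIFIER_LABELS = {"name", "address", "email"}  # hypothetical label set

def strip_identifiers(record: dict) -> dict:
    """Keep only the fields that were not labeled as identifiers."""
    return {k: v for k, v in record.items() if k not in IDENTIFIER_LABELS}

def noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    """Add Laplace(1/epsilon) noise; a count query has sensitivity 1."""
    scale = 1.0 / epsilon
    # A Laplace draw is the difference of two independent exponential draws.
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_count + noise

record = {"name": "Jane Doe", "email": "jane@example.com", "plan": "pro"}
print(strip_identifiers(record))  # {'plan': 'pro'}
print(noisy_count(1000))          # close to 1000, but never exact by design
```

Smaller `epsilon` means more noise and stronger privacy; the point is that any individual's presence in the data barely moves the released number.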

It’s a pitch that’s already gathering attention. The startup has raised $3.5 million in seed funding to get the platform off the ground, led by Greylock Partners, and with participation from Moonshots Capital, Village Global and several angel investors.

“At Google, we had to build our own tools to enable our developers to safely access data, because the tools that we needed didn’t exist,” said Sridhar Ramaswamy, a former Google executive, and now a partner at Greylock.

Gretel said it will charge customers based on consumption — a similar structure to how Amazon prices access to its cloud computing services.

“Right now, it’s very heads-down and building,” said Watson. The startup plans to ramp up its engagement with the developer community in the coming weeks, with an eye on making Gretel available in the next six months, he said.

 



Thomas Kurian on his first year as Google Cloud CEO

00:15 | 20 February

“Yes.”

That was Google Cloud CEO Thomas Kurian’s simple answer when I asked if he thought he’d achieved what he set out to do in his first year.

A year ago, he took the helm of Google’s cloud operations — which includes G Suite — and set about giving the organization a sharpened focus by expanding on a strategy his predecessor Diane Greene first set during her tenure.

It’s no secret that Kurian, with his background at Oracle, immediately put the entire Google Cloud operation on a course to focus on enterprise customers, with an emphasis on a number of key verticals.

So it’s no surprise, then, that the first highlight Kurian cited is that Google Cloud expanded its feature lineup with important capabilities that were previously missing. “When we look at what we’ve done this last year, first is maturing our products,” he said. “We’ve opened up many markets for our products because we’ve matured the core capabilities in the product. We’ve added things like compliance requirements. We’ve added support for many enterprise things like SAP and VMware and Oracle and a number of enterprise solutions.” Thanks to this, he stressed, analyst firms like Gartner and Forrester now rank Google Cloud “neck-and-neck with the other two players that everybody compares us to.”

If Google Cloud’s previous record made anything clear, though, it’s that technical know-how and great features aren’t enough. One of the first actions Kurian took was to expand the company’s sales team to resemble an organization that looked a bit more like that of a traditional enterprise company. “We were able to specialize our sales teams by industry — added talent into the sales organization and scaled up the sales force very, very significantly — and I think you’re starting to see those results. Not only did we increase the number of people, but our productivity improved as well as the sales organization, so all of that was good.”

He also cited Google’s partner business as a reason for its overall growth. Partner influence revenue increased by about 200% in 2019, and its partners brought in 13 times more new customers in 2019 when compared to the previous year.

 



Google Cloud opens its Seoul region

20:43 | 19 February

Google Cloud today announced that its new Seoul region, its first in Korea, is now open for business. The region, which it first talked about last April, will feature three availability zones and support for virtually all of Google Cloud’s standard services, ranging from Compute Engine to BigQuery, Bigtable and Cloud Spanner.

With this, Google Cloud now has a presence in 16 countries and offers 21 regions with a total of 64 zones. The Seoul region (with the memorable name of asia-northeast3) will complement Google’s other regions in the area, including two in Japan, as well as regions in Hong Kong and Taiwan, but the obvious focus here is on serving Korean companies with low-latency access to its cloud services.

“As South Korea’s largest gaming company, we’re partnering with Google Cloud for game development, infrastructure management, and to infuse our operations with business intelligence,” said Chang-Whan Sul, the CTO of Netmarble. “Google Cloud’s region in Seoul reinforces its commitment to the region and we welcome the opportunities this initiative offers our business.”

Over the course of this year, Google Cloud also plans to open more zones and regions in Salt Lake City, Las Vegas and Jakarta, Indonesia.

 



Google Cloud acquires mainframe migration service Cornerstone

18:00 | 19 February

Google today announced that it has acquired Cornerstone, a Dutch company that specializes in helping enterprises migrate their legacy workloads from mainframes to public clouds. Cornerstone, which provides very hands-on migration assistance, will form the basis of Google Cloud’s mainframe-to-GCP solutions.

This move is very much in line with Google Cloud’s overall enterprise strategy, which focuses on helping existing enterprises move their legacy workloads into the cloud (and start new projects as cloud-native solutions from the get-go).

“This is one more example of how Google Cloud is helping enterprise customers modernize their infrastructure and applications as they transition to the cloud,” said John Jester, VP of Customer Experience at Google Cloud. “We’ve been making great strides to better serve enterprise customers, including introducing Premium Support, better aligning our Customer Success organization, simplifying our commercial contracting process to make it easier to do business with Google Cloud, and expanding our partner relationships.”

A lot of businesses still rely on their mainframes to power mission-critical workloads. Moving them to the cloud is often a very complex undertaking, which is where Cornerstone and similar vendors come in. It doesn’t help that a lot of these mainframe applications were written in Cobol, PL/1 or assembly. Cornerstone’s technology can automatically break these processes down into cloud-native services that are then managed within a containerized environment. It can also migrate databases as needed.

It’s worth noting that Google Cloud also recently introduced support for IBM Power Systems in its cloud. This, too, was a move to help enterprises move their legacy systems into the cloud. With Cornerstone, Google Cloud adds yet another layer on top of this by providing even more hands-on migration assistance for users who want to slowly modernize their overall stack without having to re-architect all of their legacy applications.

 

 



How Spotify ran the largest Google Dataflow job ever for Wrapped 2019

20:30 | 18 February

In early December, Spotify launched its annual personalized Wrapped playlist with its users’ most-streamed sounds of 2019. That has become a bit of a tradition and isn’t necessarily anything new, but for 2019, it also gave users a look back at how they used Spotify over the last decade. Because this was quite a large job, Spotify gave us a bit of a look under the covers of how it generated these lists for its ever-growing number of free and paid subscribers.

It’s no secret that Spotify is a big Google Cloud Platform user. Back in 2016, the music streaming service publicly said that it was going to move to Google Cloud, after all, and in 2018, it disclosed that it would spend at least $450 million on its Google Cloud infrastructure in the following three years.

It was also back in 2018, for that year’s Wrapped, that Spotify ran the largest Google Cloud Dataflow job ever run on the platform, a service the company started experimenting with a few years earlier. “Back in 2015, we built and open-sourced a big data processing Scala API for Apache Beam and Google Cloud Dataflow called Scio,” Spotify’s VP of Engineering Tyson Singer told me. “We chose Dataflow over Dataproc because it scales with less operational overhead and Dataflow fit with our expected needs for streaming processing. Now we have a great open-source toolset designed and optimized for Dataflow, which in addition to being used by most internal teams, is also used outside of Spotify.”

For Wrapped 2019, which includes the annual and decadal lists, Spotify ran a job that was five times larger than in 2018 — but it did so at three-quarters of the cost. Singer attributes this to his team’s familiarity with the platform. “With this type of global scale, complexity is a natural consequence. By working closely with Google Cloud’s engineering teams and specialists and drawing learnings from previous years, we were able to run one of the most sophisticated Dataflow jobs ever written.”

Still, even with this expertise, the team couldn’t just iterate on the full data set as it figured out how to best analyze the data and use it to tell the most interesting stories to its users. “Our jobs to process this would be large and complex; we needed to decouple the complexity and processing in order to not overwhelm Google Cloud Dataflow,” Singer said. “This meant that we had to get more creative when it came to going from idea, to data analysis, to producing unique stories per user, and we would have to scale this in time and at or below cost. If we weren’t careful, we risked being wasteful with resources and slowing down downstream teams.”

To handle this workload, Spotify not only split its internal teams into three groups (data processing, client-facing and design, and backend systems), but also split the data processing jobs into smaller pieces. That marked a very different approach for the team. “Last year Spotify had one huge job that used a specific feature within Dataflow called “Shuffle.” The idea here was that having a lot of data, we needed to sort through it, in order to understand who did what. While this is quite powerful, it can be costly if you have large amounts of data.”

This year, the company’s engineers minimized the use of Shuffle by using Google Cloud’s Bigtable as an intermediate storage layer. “Bigtable was used as a remediation tool between Dataflow jobs in order for them to process and store more data in a parallel way, rather than the need to always regroup the data,” said Singer. “By breaking down our Dataflow jobs into smaller components — and reusing core functionality — we were able to speed up our jobs and make them more resilient.”

Singer attributes at least a part of the cost savings to this technique of using Bigtable, but he also noted that the team decomposed the problem into data collection, aggregation and data transformation jobs, which it then split into multiple separate jobs. “This way, we were not only able to process more data in parallel, but be more selective about which jobs to rerun, keeping our costs down.”
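The decomposition described above can be sketched in miniature. In this toy version (a plain dict stands in for Bigtable, and the pipeline logic is illustrative, not Spotify's), a first pass writes per-user aggregates to an intermediate keyed store instead of shuffling all events at once, and a second, independent pass transforms them — so either stage can be rerun on its own:

```python
from collections import defaultdict

# Toy sketch of splitting one monolithic shuffle-heavy job into separate
# aggregation and transformation passes, with a keyed intermediate store
# (standing in for Bigtable) between them. Illustrative, not Spotify's code.

def aggregate(events, store):
    """Pass 1: fold raw (user, track) events into per-key play counts."""
    for user, track in events:
        store[(user, track)] += 1

def transform(store):
    """Pass 2: read the intermediate store and derive each user's top track."""
    per_user = defaultdict(dict)
    for (user, track), count in store.items():
        per_user[user][track] = count
    return {u: max(tracks, key=tracks.get) for u, tracks in per_user.items()}

store = defaultdict(int)  # the intermediate keyed layer
events = [("ana", "song_a"), ("ana", "song_a"), ("ana", "song_b"), ("bo", "song_c")]
aggregate(events, store)
print(transform(store))  # {'ana': 'song_a', 'bo': 'song_c'}
```

Because the counts persist in the store, re-running only `transform` (say, to tell a different story from the same data) costs nothing in re-aggregation — the property the team credits for both the speedup and the cost savings.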

Many of the techniques the engineers on Singer’s teams developed are currently in use across Spotify. “The great thing about how Wrapped works is that we are able to build out more tools to understand a user, while building a great product for them,” he said. “Our specialized techniques and expertise of Scio, Dataflow and big data processing, in general, is widely used to power Spotify’s portfolio of products.”

 


