
Main article: Developer


Open source sustainability

22:25 | 23 June

Open source sustainability has been nothing short of an oxymoron. Engineers around the world pour their sweat and frankly, their hearts into these passion projects that undergird all software in the modern internet economy. In exchange, they ask for nothing in return except for recognition and help in keeping their projects alive and improving them. It’s an incredible movement of decentralized voluntarism and represents humanity at its best.

The internet and computing giants — the heaviest users of open source in the world — are collectively worth trillions of dollars, but you would be wrong to think that their wealth has somehow trickled down to the maintainers of the open source projects that power them. Working day jobs, maintainers today can struggle to find the time to fix critical bugs, all the while facing incessant demands from users requesting free support on GitHub. Maintainer burnout is a monstrous challenge.

That distressing situation was chronicled almost exactly two years ago by Nadia Eghbal, in a landmark report on the state of open source published by the Ford Foundation. Comparing open source infrastructure to “roads and bridges,” Eghbal provided not just a comprehensive overview of the challenges facing open source, but also a call-to-arms for more users of open source to care about its economics, and ultimately, how these critical projects can sustain themselves indefinitely.

Two years later, a new crop of entrepreneurs, open source maintainers, and organizations have taken Eghbal up on that challenge, developing solutions that maintain the volunteer spirit at the heart of open source while inventing new economic models to make the work sustainable. All are early, and their long-term effects on the output and quality of open source are unknown. But each solution offers an avenue that could radically change the way we think of a career in open source in the future.

No one sees that the Roads and Bridges are falling down

Eghbal’s report two years ago summarized the vast issues facing open source maintainers, challenges that have remained essentially unchanged in the interim. It’s a quintessential example of the “tragedy of the commons.” As Eghbal wrote at the time, “Fundamentally, digital infrastructure has a free rider problem. Resources are offered for free, and everybody (whether individual developer or large software company) uses them, so nobody is incentivized to contribute back, figuring that somebody else will step in.” That has led to a brittle ecosystem, just as open source software reached the zenith of its influence.

The challenges, though, go deeper. It’s not just that people are free riding, it’s often that they don’t even realize it. Software engineers can easily forget just how much craftsmanship has gone into the open source code that powers the most basic of applications. NPM, the company that powers the module repository for the Node ecosystem, has nearly 700,000 projects listed on its registry. When I started a new React app recently, NPM installed 1,105 libraries for my initial project in just a handful of seconds. What are all of these projects?

And more importantly, who are all the people behind them? That dependency tree of libraries abstracts all the people whose work has made those libraries available and functional in the first place. That black box can make it difficult to see that there are far fewer maintainers working behind the scenes at each of these open source projects than what one might expect, and that those maintainers may be struggling to work on those libraries due to lack of funding.
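To make that dependency count concrete, here is a minimal sketch (in Python, run from a hypothetical project root) that tallies the packages a JavaScript project has pulled into its node_modules directory, the standard layout npm uses:

    import os

    def count_packages(node_modules="node_modules"):
        """Count installed npm packages, including scoped ones (@scope/name)."""
        count = 0
        for entry in os.listdir(node_modules):
            path = os.path.join(node_modules, entry)
            if not os.path.isdir(path) or entry.startswith("."):
                continue  # skip files and npm metadata such as .bin
            if entry.startswith("@"):
                # Scoped packages nest one directory level deeper.
                count += sum(
                    os.path.isdir(os.path.join(path, sub))
                    for sub in os.listdir(path)
                )
            else:
                count += 1
        return count

    print(count_packages())  # e.g. run from a freshly created React app

Run against a brand-new project, a count in the four digits is entirely normal, which is exactly the invisibility problem described above.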

Eghbal pointed to OpenSSL as an example, a library that powers a majority of encrypted communications on the web. Following the release of the Heartbleed security bug, people were surprised to learn that the OpenSSL project was the work of a very small team of individuals, with only one of them working on it full-time (and at a very limited salary compared to industry norms).

Such a situation isn’t unusual. Open source projects often have many contributors, but only a handful of individuals are truly driving a particular project forward. Lose that singular force either to burnout or distraction, and a project can be adrift quickly.

When free isn’t free

No one wants open source to disappear, or for maintainers to burn out. Yet, there is a strong cultural force against commercial interests in the community: money is seen as corrupting, dampening the voluntary spirit of open source efforts. More pragmatically, managing money across globally distributed volunteer teams poses vast logistical challenges that can make paying for work difficult.

Unsurprisingly, the vanguard of open source sustainability sees things very differently. Kyle Mitchell, a lawyer by trade and founder of License Zero, says that there is an assumption that “Open source will continue to fall from the sky like manna from heaven and that the people behind it can be abstracted away.” He concludes: “It is just really wrong.”

That view was echoed by Henry Zhu, who is the maintainer of the popular JavaScript compiler Babel. “We trust startups with millions of VC money and encourage a culture of ‘failing fast,’ yet somehow the idea of giving to volunteers who may have shown years of dedication is undesirable?” he said.

Xavier Damman, the founder and CEO of Open Collective, says that “In every community, there are always going to be extremists. I hear them and understand them, and in an ideal world, we all have universal basic income, and I would agree with them.” Yet, the world hasn’t moved to such an income model, and so supporting the work of open source has to be an option. “Not everyone has to raise money for the open source community, but the people who want to, should be able to and we want to work with them,” he said.

Mitchell believes that one of the most important challenges is just getting comfortable talking about money. “Money feels dirty until it doesn’t,” he said. “I would like to see more money responsibility in the community.” One challenge he notes is that “learning to be a great maintainer doesn’t teach you how to be a great open source contractor or consultant.” GitHub works great as a code repository service, but ultimately doesn’t teach maintainers the economics of their work.

Supporting the individual contributor: Patreon and License Zero

Perhaps the greatest debate in sustaining open source is deciding who or what to target: the individual contributors — who often move between multiple projects — or a particular library itself.

Take Feross Aboukhadijeh for example. Aboukhadijeh (who, full disclosure, was once my college roommate at Stanford almost a decade ago) has become a major force in the open source world, particularly in the Node ecosystem. He served an elected term on the board of directors of the Node.js Foundation, and has published 125 repositories on GitHub, including popular projects like WebTorrent (with 17,000 stars) and Standard (18,300 stars).

Aboukhadijeh was looking for a way to spend more time on open source, but didn’t want to be beholden to working on a single project or writing code at a private company that would never see the light of day. So he turned to Patreon as a means of support.

(Disclosure: CRV, my most immediate former employer, is the series A investor in Patreon. I have no active or passive financial interest in this specific company. As per my ethics statement, I do not write about CRV’s portfolio companies, but given that this essay focuses on open source, I made an exception).

Patreon is a crowdsourced subscription platform, perhaps best known for the creatives it hosts. These days though, it is also increasingly being used by notable open source contributors as a way to connect with fans and sustain their work. Aboukhadijeh launched his page after seeing others doing it. “A bunch of people were starting up Patreons, which was kind of a meme in my JavaScript circles,” he said. His Patreon page today has 72 contributors providing him with $2,874 in funding per month ($34,488 annually).

That may seem a bit paltry, but he explained to me that he also supplements his Patreon with funding from organizations ranging from Brave (an adblocking browser with a utility token model) to PopChest (a decentralized video sharing platform). That nets him a couple thousand dollars more per month.

Aboukhadijeh said that Twitter played an outsized role in building out his revenue stream. “Twitter is the most important on where the developers talk about stuff and where conversations happen…,” he said. “The people who have been successful on Patreon in the same cohort [as me] who tweet a lot did really well.”

For those who hit it big, the revenues can be outsized. Evan You, who created the popular JavaScript frontend library Vue.js, has reached $15,206 in monthly earnings ($182,472 a year) from 231 patrons. The number of patrons has grown consistently since starting his Patreon in March 2016 according to Graphtreon, although earnings have gone up and down over time.

Aboukhadijeh noted that one major benefit was that he had ownership over his own funds. “I am glad I did a Patreon because the money is mine,” he said.

While Patreon is one direct approach for generating revenues from users, another one is to offer dual licenses, one free and one commercial. That’s the model of License Zero, which Kyle Mitchell proposed last year. He explained to me that “License Zero is the answer to a really simple question with no simple answers: how do we make open source business models open to individuals?”

Mitchell is a rare breed: a lifelong coder who decided to go to law school. Growing up, he wanted to use software he found on the web, but “if it wasn’t free, I couldn’t download it as a kid,” he said. “That led me into some of the intellectual property issues that paved a dark road to the law.”

License Zero is a permissive license based on the two-clause BSD license, but adds terms requiring commercial users to pay for a commercial license after 90 days, allowing companies to try a project before purchasing it. If other licenses aren’t available for purchase (say, because a maintainer is no longer involved), then the language is no longer enforceable and the software is offered as fully open source. The idea is that other open source users can always use the software for free, but for-profit uses would require a payment.

Mitchell believes that this is the right approach for individuals looking to sustain their efforts in open source. “The most important thing is the time budget – a lot of open source companies or people who have an open source project get their money from services,” he said. The problem is that services are exclusive to a company and take time away from making a project as good as it can be. “When moneymaking time is not time spent on open source, then it competes with open source,” he said.

License Zero is certainly a cultural leap away from the notion that open source should be free in cost to all users. Mitchell notes though that “companies pay for software all the time, and they sometimes pay even when they could get it for free.” Companies care about proper licensing, and that becomes the leverage to gain revenue while still maintaining the openness and spirit of open source software. It also doesn’t force open source maintainers to take away critical functionality — say a management dashboard or scaling features — to force a sale.
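As a sketch of the kind of audit this model depends on, the snippet below inventories the declared license of every package installed in a Python environment using the standard-library importlib.metadata module; the same idea applies to npm or any other ecosystem that ships license metadata with its packages:

    from importlib.metadata import distributions

    def license_inventory():
        """Map each installed distribution to its declared license string."""
        inventory = {}
        for dist in distributions():
            name = dist.metadata.get("Name", "unknown")
            inventory[name] = dist.metadata.get("License") or "UNKNOWN"
        return inventory

    for name, lic in sorted(license_inventory().items()):
        print(f"{name}: {lic}")

A company that runs this kind of report regularly knows exactly which of its dependencies would fall under commercial terms like License Zero’s, which is the leverage Mitchell is describing.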

Changing the license of existing projects can be challenging, so the model would probably best be used by new projects. Nonetheless, it offers a potential complement or substitute to Patreon and other subscription platforms for individual open source contributors to find sustainable ways to engage in the community full-time while still putting a roof over their heads.

Supporting the organization: Tidelift and Open Collective

Supporting individuals makes a lot of sense, but often companies want to support the specific projects and ecosystems that underpin their software. Doing so can be next to impossible. There are complicated logistics required in order for companies to fund open source, such as actually having an organization to send money to (and for many, to convince the IRS that the organization is actually a non-profit). Tidelift and Open Collective are two different ways to open up those channels.

Tidelift is the brainchild of four open-source fanatics led by Donald Fischer. Fischer, who is CEO, is a former venture investor at General Catalyst and Greylock as well as a long-time executive at Red Hat. In his most recent work, Fischer invested in companies at the heart of open source ecosystems, such as Anaconda (which focuses on scientific and statistical computing within Python), Julia Computing (focused on the Julia programming language), Ionic (a cross-platform mobile development framework), and Typesafe, now Lightbend (which is behind the Scala programming language).

Fischer and his team wanted to create a platform that would allow open source ecosystems to sustain themselves. “We felt frustrated at some level that while open source has taken over a huge portion of software, a lot of the creators of open source have not been able to capture a lot of the value they are creating,” he explained.

Tidelift is designed to offer assurances “around areas like security, licensing, and maintenance of software,” Fischer explained. The idea has its genesis in Red Hat, which commercialized Linux. The idea is that companies are willing to pay for open source when they can receive guarantees around issues like critical vulnerabilities and long-term support. In addition, Tidelift handles the mundane tasks of setting up open source for commercialization such as handling licensing issues.

Fischer sees a mutualism between companies buying Tidelift and the projects the startup works with. “We are trying to make open source better for everyone involved, and that includes both the creators and users of open source,” he said. “What we focus on is getting these issues resolved in the upstream open source project.” Companies are buying assurances, but not exclusivity, so if a vulnerability is detected for instance, it will be fixed for everyone.

Tidelift initially launched in the JavaScript ecosystem around React, Angular, and Vue.js, but will expand to more communities over time. The company has raised $15 million in venture capital from General Catalyst and Foundry Group, plus former Red Hat chairman and CEO Matthew Szulik.

Fischer hopes that the company can change the economics for open source contributors. He wants the community to move from a model of “get by and survive” with a “subsistence level of earnings” and instead, help maintainers of great software “win big and be financially rewarded for that in a significant way.”

Where Tidelift is focused on commercialization and software guarantees, Open Collective wants to open source the monetization of open source itself.

Open Collective is a non-profit platform that provides tools to “collectives” to receive money while also offering mechanisms to allow the members of those collectives to spend their money in a democratic and transparent way.

Take, for instance, the open collective sponsoring Babel. Babel today receives an annual budget of $113,061 from contributors. Even more interesting though is that anyone can view how the collective spends its money. Babel currently has $28,976.82 in its account, and every expense is listed. For instance, core maintainer Henry Zhu, whom we met earlier in this essay, expensed $427.18 on June 2nd for two weeks’ worth of Lyft rides in SF and Seattle.

Xavier Damman, CEO and founder of Open Collective, believes that this radical transparency could reshape how the economics of open source are considered by its participants. Damman likens Open Collective to the “View Source” feature of a web browser that allows users to read a website’s code. “Our goal as a platform is to be as transparent as possible,” he said.
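That transparency extends to the platform’s public API. The sketch below queries Open Collective’s GraphQL endpoint for a collective’s balance; the endpoint exists, but the query shape and field names here are assumptions about the v2 schema and should be checked against the API documentation before use:

    import json
    import urllib.request

    # Field names below are assumptions about Open Collective's v2 schema.
    QUERY = """
    query($slug: String) {
      collective(slug: $slug) {
        name
        stats { balance { valueInCents currency } }
      }
    }
    """

    def collective_balance(slug="babel"):
        payload = json.dumps({"query": QUERY, "variables": {"slug": slug}}).encode()
        req = urllib.request.Request(
            "https://api.opencollective.com/graphql/v2",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    print(collective_balance())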

Damman was formerly the founder of Storify. Back then, he built an open source project designed to help journalists accept anonymous tips, which received a grant. The problem was that “I got a grant, and I didn’t know what to do with the money.” He thought of giving it to some other open source projects, but “technically, it was just impossible.” Without legal entities or paperwork, the money just wasn’t fungible.

Open Collective is designed to solve those problems. Open Collective itself is a 501(c)(6) non-profit, and it technically receives all money destined for any of the collectives hosted on its platform as their fiscal sponsor. That allows the organization to send out invoices to companies, providing them with the documentation they need in order to write a check. “As long as they have an invoice, they are covered,” Damman explained.

Once a project has money, it is up to the maintainers of that community to decide how to spend it. “It is up to each community to define their own rules,” Damman said. He notes that open source contributors can often spend the money on the kind of uninteresting work that doesn’t normally get done, which Damman analogized as “pay people to keep the place clean.” No one wants to clean a public park, but if no one does it, then no one will ever use the park. He also noted that in-person meetings are a popular usage of revenues.

Open Collective was launched in late 2015, and since then has become home to 647 open source projects. So far, Webpack, the popular JavaScript build tool, has generated the most revenue, currently sitting at $317,188 a year. One major objective of the non-profit is to encourage more for-profit companies to commit dollars to open source. Open Collective places the logos of major donors on each collective page, giving them visible credit for their commitment to open source.

Damman’s ultimate dream is to change the notion of ownership itself. We can move from “Competition to collaboration, but also ownership to commons,” he envisioned.

Sustaining sustainability

It’s unfortunately very early days for open source sustainability. While Patreon, License Zero, Tidelift, and Open Collective are different approaches to providing the infrastructure for sustainability, ultimately someone has to pay to make all that infrastructure useful. There are only a handful of Patreons that could substitute for an engineer’s day job, and only two collectives by my count on Open Collective that could support even a single maintainer full time. License Zero and Tidelift are too new to know how they will perform yet.

Ultimately though, we need to change the culture toward sustainability. Henry Zhu of Babel commented, “The culture of our community should be one that gives back and supports community projects with all that they can: whether with employee time or funding. Instead of just embracing the consumption of open source and ignoring the cost, we should take responsibility for its sustainability.”

In some ways, we are merely back to the original free rider problem in the tragedy of the commons — someone, somewhere has to pay, but all get to share in the benefits.

The change though can happen through all of us who work on code — every software engineer and product manager. If you work at a for-profit company, take the lead in finding a way to support the code that allows you to do your job so efficiently. The decentralization and volunteer spirit of the open source community needs exactly the same kind of decentralized spirit in every financial contributor. Sustainability is each of our jobs, every day. If we all do our part, we can help to sustain one of the great intellectual movements humanity has ever created, and end the oxymoron of open source sustainability forever.

 



Facebook mistakenly leaked developer analytics reports to testers

19:57 | 22 June

Set the “days without a Facebook privacy problem” counter to zero. This week, an alarmed developer contacted TechCrunch, informing us that their Facebook App Analytics weekly summary email had been delivered to someone outside their company. It contains sensitive business information, including weekly average users, page views, and new users.

After two days of investigating, Facebook now confirms to TechCrunch that 3 percent of apps using Facebook Analytics had their weekly summary reports sent to their app’s testers, instead of only the app’s developers, admins, and analysts.

Testers are often people outside of a developer’s company. If the leaked info got to an app’s competitors, it could provide them an advantage. At least they weren’t allowed to click through to view more extensive historical analytics data on Facebook’s site.

Facebook tells us it plans to notify all impacted developers about the leak today. Below you can find the email the company will send:

Subject line: We recently resolved an error with your weekly summary email

We wanted to let you know about a recent error where a summary e-mail from Facebook Analytics about your app was sent to testers of your app ‘[APP NAME WILL BE DYNAMICALLY INSERTED HERE]’. As you know, we send weekly summary emails to keep you up to date with some of your top-level metrics — these emails go to people you’ve identified as Admins, Analysts and Developers. You can also add Testers to your account, people designated by you to help test your apps when they’re in development.

We mistakenly sent the last weekly email summary to your Testers, in addition to the usual group of Admins, Analysts and Developers who get updates. Testers were only able to see the high-level summary information in the email, and were not able to access any other account information; if they clicked “View Dashboard” they did not have access to any of your Facebook Analytics information.

We apologize for the error and have made updates to prevent this from happening again.

One affected developer told TechCrunch “Not sure why it would ever be appropriate to send business metrics to an app user. When I created my app (in beta) I added dozens of people as testers as it only meant they could login to the app…not access info!” They’re still waiting for the disclosure from Facebook.

The privacy mistake comes just weeks after a bug caused 14 million users’ Facebook status update composers to change their default privacy setting to public. And Facebook has had problems with misdelivering business information before. In 2014, Facebook accidentally sent advertisers receipts for other businesses’ ad campaigns, causing significant confusion. The company has also misreported metrics about Page reach and more on several occasions.

While Facebook has been working diligently to patch app platform privacy holes since the Cambridge Analytica scandal, removing access to many APIs and strengthening human reviews of apps, issues like today’s make it hard to believe Facebook has a proper handle on the data of its 2 billion users.

 



WordPress.com parent company acquires Atavist

19:47 | 22 June

Automattic, the company behind WordPress.com, WooCommerce, Longreads, Simplenote and a few other things, is acquiring Brooklyn-based startup Atavist.

Atavist has been working on a content management system for independent bloggers and writers. With an Atavist website, you can easily write and publish stories with a ton of media.

You might think that this isn’t particularly groundbreaking as anyone can create a website on WordPress.com or Squarespace and do the same thing. But the company also lets you create a paywall and build a subscription base.

Many writers don’t want to deal with the technical details of running a website. That’s why Atavist gives you the tools so that you can focus on your stories.

Atavist is also running a publication called Atavist Magazine. The publication is also joining Automattic. It’s unclear if it’s going to be part of Longreads or remain its own thing.

The CMS itself won’t stick around. Automattic said that the publishing platform will be integrated into WordPress. And this is the interesting part.

WordPress is probably a much more solid CMS than Atavist, but the acquisition could mean that Automattic wants to start offering subscriptions and paywalls. You can imagine WordPress.com websites that offer monthly subscriptions natively.

30 percent of the web runs on WordPress. Many of those sites are self-hosted open source instances of WordPress running on their owners’ own servers. But many websites are hosted by WordPress.com, including TechCrunch.

Subscriptions on WordPress.com would be good news for the web. Medium abruptly canceled its subscription program, leaving many independent publications in the dust. So it’s hard to trust Medium when it comes to providing enough revenue to independent writers.

Automattic could create a seamless portal to manage subscriptions to multiple publications. And this could lead to less advertising and better content.

 



GitHub Education is now free for schools

20:00 | 19 June

GitHub, the code sharing and collaboration platform that Microsoft is acquiring, today announced that its GitHub Education suite of services is now available for free to any school that wants to use it to teach its students.

GitHub previously trialed this program with a few schools and is now making it widely available.

It’s worth noting that GitHub has long been available for free to individual students and teachers who want to use it in their classrooms. GitHub Education goes a step beyond this and offers schools access to GitHub Enterprise or Business Hosted accounts, as well as access to training, dedicated support for the school’s head of IT or CTO — and swag.

As part of this program, students also get access to the Student Developer Pack, which offers free access to tools and credits for services like Datadog, Travis CI and DigitalOcean.

To participate in this program, a school has to make GitHub available to all of its technical departments, make sure that its faculty and students receive regular announcements about the service and send one person from each department to GitHub’s Campus Advisors training.

 



App Maker, Google’s low-code tool for building business apps, comes out of beta

19:00 | 14 June

It’s been a year and a half since Google announced App Maker, its online tool for quickly building and deploying business apps on the web. The company has mostly remained quiet about App Maker ever since and kept it in a private preview mode, but today, it announced that the service is now generally available and open to all developers who want to give it a try.

Access to App Maker comes with any G Suite Business and Enterprise subscription, as well as the G Suite for Education edition. The overall idea here is to help virtually anybody in an organization — including those with little to no coding experience — build their own line-of-business apps based on data that’s already stored in G Suite, Google’s Cloud SQL database or any other database that supports JDBC or offers a REST API (though that’s obviously a bit more of an advanced operation).


To do this, App Maker provides users with a low-code application development environment that lets you build applications through a straightforward drag and drop environment. Though it takes a bit of work to set up the database connectivity, once that’s done, the actual design part looks to be pretty easy — and thanks to a set of responsive templates, those final applications should work, no matter whether you are on a phone or desktop.

While many applications will likely rely on a database, it’s worth noting that developers can access Gmail, Google Calendar, Sheets and other data sources as well. In total, App Maker offers access to 40 Google Services. Unlike other low-code services like Mendix, K2 or even Microsoft’s PowerApps tools, Google’s App Maker seems to focus mostly on Google’s own services and doesn’t offer built-in connectivity with third-party services like Salesforce, for example. Chances are, of course, that now that App Maker is out of preview, Google will start adding more functionality to the service.

 



Amazon starts shipping its $249 DeepLens AI camera for developers

07:01 | 14 June

Back at its re:Invent conference in November, AWS announced its $249 DeepLens, a camera that’s specifically geared toward developers who want to build and prototype vision-centric machine learning models. The company started taking pre-orders for DeepLens a few months ago, but now the camera is actually shipping to developers.

Ahead of today’s launch, I had a chance to attend a workshop in Seattle with DeepLens senior product manager Jyothi Nookula and Amazon’s VP for AI Swami Sivasubramanian to get some hands-on time with the hardware and the software services that make it tick.

DeepLens is essentially a small Ubuntu- and Intel Atom-based computer with a built-in camera that’s powerful enough to easily run and evaluate visual machine learning models. In total, DeepLens offers about 106 GFLOPS of performance.

The hardware has all of the usual I/O ports (think Micro HDMI, USB 2.0, Audio out, etc.) to let you create prototype applications, no matter whether those are simple toy apps that send you an alert when the camera detects a bear in your backyard or an industrial application that keeps an eye on a conveyor belt in your factory. The 4 megapixel camera isn’t going to win any prizes, but it’s perfectly adequate for most use cases. Unsurprisingly, DeepLens is deeply integrated with the rest of AWS’s services. Those include the AWS IoT service Greengrass, which you use to deploy models to DeepLens, for example, but also SageMaker, Amazon’s newest tool for building machine learning models.

These integrations are also what makes getting started with the camera pretty easy. Indeed, if all you want to do is run one of the pre-built samples that AWS provides, it shouldn’t take you more than 10 minutes to set up your DeepLens and deploy one of these models to the camera. Those project templates include an object detection model that can distinguish between 20 objects (though it had some issues with toy dogs), a style transfer example to render the camera image in the style of van Gogh, a face detection model and a model that can distinguish between cats and dogs and one that can recognize about 30 different actions (like playing guitar, for example). The DeepLens team is also adding a model for tracking head poses. Oh, and there’s also a hot dog detection model.

But that’s obviously just the beginning. As the DeepLens team stressed during our workshop, even developers who have never worked with machine learning can take the existing templates and easily extend them. In part, that’s due to the fact that a DeepLens project consists of two parts: the model and a Lambda function that runs instances of the model and lets you perform actions based on the model’s output. And with SageMaker, AWS now offers a tool that also makes it easy to build models without having to manage the underlying infrastructure.
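To make that two-part structure concrete, here is a condensed Python sketch in the style of AWS’s published DeepLens samples: an inference loop that grabs a frame from the camera, runs the deployed model, and acts on detections. The awscam module is preinstalled on the device itself; the model path, model type and threshold are placeholders:

    import awscam  # preinstalled on the DeepLens device

    MODEL_PATH = "/opt/awscam/artifacts/my-model.xml"  # placeholder path
    THRESHOLD = 0.5

    def infinite_infer_run():
        """Frame-grab and inference loop, mirroring AWS's sample structure."""
        model = awscam.Model(MODEL_PATH, {"GPU": 1})
        while True:
            ret, frame = awscam.getLastFrame()
            if not ret:
                continue
            results = model.parseResult("ssd", model.doInference(frame))["ssd"]
            for obj in results:
                if obj["prob"] > THRESHOLD:
                    # In the samples, obj["label"] is a numeric class index;
                    # a real handler might publish detections to AWS IoT here.
                    print(obj["label"], obj["prob"])

    infinite_infer_run()

The Lambda function half of a project is essentially this loop; the model half is whatever artifact SageMaker or an imported framework produced.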

You could do a lot of the development on the DeepLens hardware itself, given that it is essentially a small computer, though you’re probably better off using a more powerful machine and then deploying to DeepLens using the AWS Console. If you really wanted to, you could use DeepLens as a low-powered desktop machine as it comes with Ubuntu 16.04 pre-installed.

For developers who know their way around machine learning frameworks, DeepLens makes it easy to import models from virtually all the popular tools, including Caffe, TensorFlow, MXNet and others. It’s worth noting that the AWS team also built a model optimizer for MXNet models that allows them to run more efficiently on the DeepLens device.

So why did AWS build DeepLens? “The whole rationale behind DeepLens came from a simple question that we asked ourselves: How do we put machine learning in the hands of every developer,” Sivasubramanian said. “To that end, we brainstormed a number of ideas and the most promising idea was actually that developers love to build solutions as hands-on fashion on devices.” And why did AWS decide to build its own hardware instead of simply working with a partner? “We had a specific customer experience in mind and wanted to make sure that the end-to-end experience is really easy,” he said. “So instead of telling somebody to go download this toolkit and then go buy this toolkit from Amazon and then wire all of these together. […] So you have to do like 20 different things, which typically takes two or three days and then you have to put the entire infrastructure together. It takes too long for somebody who’s excited about learning deep learning and building something fun.”

So if you want to get started with deep learning and build some hands-on projects, DeepLens is now available on Amazon. At $249, it’s not cheap, but if you are already using AWS — and maybe even use Lambda already — it’s probably the easiest way to get started with building these kinds of machine learning-powered applications.

 



Sumo Logic brings data analysis to containers

16:00 | 12 June

Sumo Logic has long held the goal to help customers understand their data wherever it lives. As we move into the era of containers, that goal becomes more challenging because containers by their nature are ephemeral. The company announced a product enhancement today designed to instrument containerized applications in spite of that.

They are debuting these new features at DockerCon, Docker’s customer conference taking place this week in San Francisco.

Sumo’s CEO Ramin Sayer says containers have begun to take hold over the last 12-18 months with Docker and Kubernetes emerging as tools of choice. Given their popularity, Sumo wants to be able to work with them. “[Docker and Kubernetes] are by far the most standard things that have developed in any new shop, or any existing shop that wants to build a brand new modern app or wants to lift and shift an app from on prem [to the cloud], or have the ability to migrate workloads from Vendor A platform to Vendor B,” he said.

He’s not wrong of course. Containers and Kubernetes have been taking off in a big way over the last 18 months and developers and operations alike have struggled to instrument these apps to understand how they behave.

“But as that standardization of adoption of that technology has come about, it makes it easier for us to understand how to instrument, collect, analyze, and more importantly, start to provide industry benchmarks,” Sayer explained.

They do this by avoiding the use of agents. Regardless of how you run your application, whether in a VM or a container, Sumo is able to capture the data and give you feedback you might otherwise have trouble retrieving.


The company has built in native support for Kubernetes and Amazon Elastic Container Service for Kubernetes (Amazon EKS). It also supports the open source tool Prometheus favored by Kubernetes users to extract metrics and metadata. The goal of the Sumo tool is to help customers fix issues faster and reduce downtime.
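As a minimal illustration of what exposing Prometheus metrics means in practice, the sketch below uses the official prometheus_client Python library to serve a counter that any Prometheus-compatible collector can then scrape; the metric name and port are arbitrary:

    import time
    from prometheus_client import Counter, start_http_server

    # Arbitrary demo metric; scrapers read it from the /metrics endpoint.
    REQUESTS = Counter("demo_requests_total", "Requests handled by the demo app")

    start_http_server(8000)  # serves http://localhost:8000/metrics
    while True:
        REQUESTS.inc()       # pretend we handled one request
        time.sleep(1)

Instrumented applications expose an endpoint like this, and tools in the Kubernetes ecosystem, Sumo’s collectors included, pull from it on a schedule.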

As they work with this technology, they can begin to understand norms and pass that information onto customers. “We can guide them and give them best practices and tips, not just on what they’ve done, but how they compare to other users on Sumo,” he said.

Sumo Logic was founded in 2010 and has raised $230 million, according to data on Crunchbase. Its most recent round was a $70 million Series F led by Sapphire Ventures last June.

 



Automated dev platform CircleCI expands to Japan, first office outside U.S.

02:22 | 12 June

CircleCI is on something of a tear. The company’s continuous integration and deployment build platform is used by hundreds of thousands of developers around the world to create their own software. It has also received $59 million in venture capital funding, including a $31 million Series C earlier this year.

As it looks to continue growing, the company is expanding its global footprint. It has opened its first office outside its San Francisco headquarters in Tokyo, Japan. As part of the opening, CircleCI intends to eventually build an office of 4-5 employees and create partnerships with local companies.

The company has experience in the geography, with several remote workers stationed there. It’s also the third largest market for the company, after the United States and the United Kingdom, where it works with local companies like CyberAgent and DeNA.

“We are really excited about Japan, excited about global,” CEO Jim Rose explained. “Japan has been a market that has had its own momentum, and it has had speed that has picked up over the years.” Rose joined CircleCI as COO in 2014 through the company’s acquisition of Distiller, and became CEO in 2015.

CircleCI has had a bottom-up sales model, where developers can install Circle on their infrastructure anywhere around the world. The company’s message has been heard widely, with roughly 35-40% of the company’s gross revenues coming from global customers, according to Rose.

However, CircleCI’s product is not just click-and-install. It’s also a whole new way of managing software in a cloud-native environment, which means that developers and managers are increasingly needing to work together to migrate legacy codebases from old models to cloud and Git-native ones. “What we have seen over the last six quarters is that that practice is starting to embed itself in large enterprise,” Rose said.

However, that education, training, and cultural change has been tougher in non-English speaking markets like Japan. Rose says that once a company gets beyond the first step of installing a system like Circle, “there is another step of socializing the product inside of those companies,” and “those efforts require local knowledge.” The hope is that a dedicated, localized team designed to bridge that gap will help CircleCI cement its products in developers’ workflows.

While the U.K. is the second-largest market for the company, CircleCI chose Japan to launch its international expansion since its English-language resources have proven adequate there so far, and complications from Brexit make strategic planning in Europe more complicated.

“There are a lot of moving parts around Brexit and GDPR and whether you can approach them as a single market or multiple. At the very least, you have to approach the UK as its own market separate from the EU,” Rose explained. CircleCI is still determining the right way to set up its international expansion in Europe to encompass successful markets for the company like Germany, France, and the Nordic countries.

Rose sees the company eventually increasing the share of global revenues to 50%. Japan then is just the start of intensifying global expansion for the company.

 



Four years after its release, Kubernetes has come a long way

20:52 | 6 June

On June 6th, 2014, Kubernetes was released. At the time, nobody could have predicted that four years later the project would become a de facto standard for container orchestration or that the biggest tech companies in the world would be backing it. That would come later.

If you think back to June 2014, containerization was just beginning to take off thanks to Docker, which was popularizing the concept with developers, but it was still so early that there was no standard way to manage those containers.

Google had been using containers as a way to deliver applications for years and ran a tool called Borg to handle orchestration. It’s called an orchestrator because, much like the conductor of an orchestra, it decides when a container is launched and when it shuts down once it has completed its job.
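In practice, that orchestration is declarative: you tell Kubernetes the state you want, and it launches or kills containers to converge on it. A minimal sketch using the official Kubernetes Python client (the image and names are arbitrary, and a configured kubeconfig is assumed):

    from kubernetes import client, config

    def launch_deployment():
        """Declare three replicas of a container; Kubernetes keeps them running."""
        config.load_kube_config()  # authenticate with your local kubeconfig
        template = client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.25")]
            ),
        )
        spec = client.V1DeploymentSpec(
            replicas=3,
            selector=client.V1LabelSelector(match_labels={"app": "web"}),
            template=template,
        )
        deployment = client.V1Deployment(
            metadata=client.V1ObjectMeta(name="web"), spec=spec
        )
        client.AppsV1Api().create_namespaced_deployment(
            namespace="default", body=deployment
        )

    launch_deployment()

If a container in the deployment dies, the orchestrator notices the divergence from the declared state and starts a replacement, which is exactly the conductor role described above.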

At the time, two Google engineers, Craig McLuckie and Joe Beda, who would later go on to start Heptio, were looking at developing an orchestration tool like Borg for companies that might not have the depth of engineering talent of Google to make it work. They wanted to spread this idea of how they develop distributed applications to other developers.

Hello world

Before that first version hit the streets, what would become Kubernetes developed out of a need for an orchestration layer that Beda and McLuckie had been considering for a long time. They were both involved in bringing Google Compute Engine, Google’s Infrastructure as a Service offering, to market, but they felt like there was something missing in the tooling that would fill in the gaps between infrastructure and platform service offerings.

“We had long thought about trying to find a way to bring a sort of a more progressive orchestrated way of running applications in production. Just based on our own experiences with Google Compute Engine, we got to see firsthand some of the challenges that the enterprise faced in moving workloads to the cloud,” McLuckie explained.

He said that they also understood some of the limitations associated with virtual machine-based workloads and they were thinking about tooling to help with all of that. “And so we came up with the idea to start a new project, which ultimately became Kubernetes.”

Let’s open source it

When Google began developing Kubernetes in March 2014, it wanted nothing less than to bring container orchestration to the masses. It was a big goal and McLuckie, Beda and teammate Brendan Burns (and later Tim Hockin) believed the only way to get there was to open source the technology and build a community around it. As it turns out, they were spot on with that assessment, but couldn’t have been 100 percent certain at the time. Nobody could have.


“If you look at the history, we made the decision to open source Kubernetes and make it a community-oriented project much sooner than conventional wisdom would dictate and focus on really building a community in an open and engaged fashion. And that really paid dividends as Kubernetes has accelerated and effectively become the standard for container orchestration,” McLuckie said.

The next thing they did was to create the Cloud Native Computing Foundation (CNCF) as an umbrella organization for the project. If you think about it, this project could have gone in several directions, as current CNCF director Dan Kohn described in a recent interview.

Going cloud native

Kohn said Kubernetes was unique in a couple of ways. First of all, it was based on existing technology developed over many years at Google. “Even though Kubernetes code was new, the concepts and engineering and know-how behind it was based on 15 years at Google building Borg (And a Borg replacement called Omega that failed),” Kohn said. The other thing was that Kubernetes was designed from the beginning to be open sourced.


He pointed out that Google could have gone in a few directions with Kubernetes. It could have created a commercial product and sold it through Google Cloud. It could have open sourced it, but had a strong central lead as they did with Go. They could have gone to the Linux Foundation and said they wanted to create a stand-alone Kubernetes Foundation. But they didn’t do any of these things.

McLuckie says they decided to do something entirely different and place it under the auspices of the Linux Foundation, but not as a Kubernetes project. Instead they wanted to create a new framework for cloud native computing itself, and the CNCF was born. “The CNCF is a really important staging ground, not just for Kubernetes, but for the technologies that needed to come together to really complete the narrative, to make Kubernetes a much more comprehensive framework,” McLuckie explained.

Getting everyone going in the same direction

Over the last few years, we have watched as Kubernetes has grown into a container orchestration standard. Last summer, in quick succession, a slew of major enterprise players joined the CNCF, as AWS, Oracle, Microsoft, VMware and Pivotal all signed on. They came together with Red Hat, Intel, IBM, Cisco and others who were already members.


Each of these players no doubt wanted to control the orchestration layer, but they saw Kubernetes gaining momentum so rapidly that they had little choice but to go along. Kohn jokes that having all these big-name players on board is like herding cats, but bringing them in has been the goal all along. He said it just happened much faster than he thought it would.

David Aronchick, who runs the open source Kubeflow Kubernetes machine learning project at Google, worked on Kubernetes in the early days and remains shocked by how quickly it has grown. “I couldn’t have predicted it would be like this. I joined in January, 2015 and took on project management for Google Kubernetes. I was stunned at the pent-up demand for this kind of thing,” he told TechCrunch in a recent interview.

As it has grown, it has become readily apparent that McLuckie was right about building that cloud native framework instead of a stand-alone Kubernetes foundation. Today there are dozens of adjacent projects and the organization is thriving.

Nobody is more blown away by this than McLuckie himself, who says seeing Kubernetes hit these various milestones since its 1.0 release has been amazing for him and his team to watch. “It’s just been a series of these wonderful kind of moments as Kubernetes has gained a head of steam, and it’s been so much fun to see the community really rally around it.”

 



Microsoft program provides a decade of updates for Windows IoT devices

09:00 | 6 June

If you have an essential Internet of Things device running Windows 10 IoT Core Service, you don’t want to be worried about security and OS patches over a period of years. Microsoft wants to help customers running these kinds of devices with a new program that guarantees 10 years of updates.

The idea is that as third-party partners build applications on top of the Windows 10 IoT Core Services, these OEMs, who create the apps, can pay Microsoft to guarantee updates for these devices for a decade. This can help assure customers that they won’t be vulnerable to attack on these critical systems from unpatched applications.

The service does more than provide updates though. It also gives OEMs the ability to manage the updates and assess the device’s health.

“The Windows IoT Core service offering is enabling partners to commercialize secure IoT devices backed by industry-leading support. And so device makers will have the ability to manage updates for the OS, for the apps and for the settings for OEM-specific files,” Dinesh Narayanan, director of business development for emerging markets explained.

It gives OEMs creating Windows-powered applications on machines like healthcare devices or ATMs the ability to manage them over an extended period. That’s particularly important as these devices tend to have a longer usage period than, say, a PC or tablet. “We want to extend support and commit to that support over the long haul for these devices that have a longer life cycle,” Narayanan said.

Beyond the longevity, the service also provides customers with access to the Device Update Center where they can control and customize how and when the devices get updated. It also includes another level of security called Device Health Attestation that allows the OEMs to evaluate the trustworthiness of the devices before they update them using a third party service.

All of this is designed to give Microsoft a foothold in the growing IoT space and to provide an operating system for these devices as they proliferate. While predictions vary dramatically, Gartner has predicted that at least 20 billion connected devices will be online in 2020.

While not all of these will be powered by Windows, or require advanced management capabilities, customers whose vendors use this program can be assured that their devices can be managed and kept up to date. And when it comes to the Internet of Things, chances are that’s going to be critical.

 


