3 years ago in Things
We run these applications in cloud-native environments with virtual displays and stream the video directly to the browser, while the user's input (mouse, keyboard, etc.) is forwarded to the applications in near real time. The platform has a very generous free plan for all free applications, and as we improve our cloud we intend to keep all free applications fully free.
3 years ago in Quotes
I live in Eastern Europe. A local city with a population of 300-400k was hit with a near total ransomware attack. The hackers asked for 400 bitcoin.
The mayor answered them on TV: "You fools, we still do most things on paper here! We'll just spend the weekend installing Windows and Word, and F** Y*!!!"
Personal data, access to it, the right to spy on millions of people, carried out by the private sector, is now being fought over by nation states. Younger generations should be up in arms over surveillance. Instead, at least in the US, they want to use these apps and work for these shameful companies. Older generations, who should know better, are willingly using Alexas and the like. Interesting times.
People blame developers but it's all driven by a product mentality that favors rapid iterations and technical debt to run business experiments on customers. Slow-and-steady, carefully written software isn't tolerated within many product orgs these days.
Apple values my blood, sweat, and tears at $0.99 per user per lifetime, minus the 30% Apple tax, minus government taxes.
I don't mean to compare it to a sweatshop, because I live in the first world and have opportunities, but this is a demeaning shakedown and a devaluation of my pride, product, and work.
1. Shifted where generic computing happens
2. Downplayed the web as the end-all, be-all of application delivery. (It could have been amazing with WASM and sandboxing back in the 00's!)
3. Prevented generic apps from gaining distribution outside of Apple's control and tax
They took advantage of open source, the web, and the Internet. Then they shit on it and offered up the App Store protection racket as salvation.
It's only one of several themes where the giants of today crush the little guy. Computing is less free today than it was a decade ago.
Before Apple I had reach and distribution. Now I have less than 50% of that. And I don't have liberty and control over my own narrative anymore.
In order to refocus the Firefox organization on core browser growth through differentiated user experiences, we are reducing investment in some areas such as developer tools, internal tooling, and platform feature development.
I don't think the user population is aware how disingenuous all of this tech crap is. It could be so awesome, and they don't even understand what's not awesome about it. It hurts in a deep, emotional space.
I have found so much inspiration in some of the great programmers of two generations ago. The writings of Chuck Moore and Alan Kay convince me that we somehow took two orders of magnitude of backwards steps in creating the present milieu of dysfunctional technology.
The worst part, IMO, is that it's all opaque. I don't control the device that I hold in my hand. I can't fix it because Google or Apple don't want me to. It is a tool of economic and social control, not a powerful technology that I can wield.
.. we're just writing too much code. Companies have hundreds of millions of lines of code in production right now and nobody who knows how it works. What we're doing is a kind of runaway train: we're hiring more and more people to write more and more code, and trying to ramp education up to produce more and more people. I can only say this from the sidelines, as I don't have a degree, but we seem to, at least per these individuals, be cheapening computer science at the (possibly indirect) behest of businesses that aren't willing or able to step back and try things differently.
Nothing fucking works. Nothing. Turning it off and back on again isn't a cute ritual, it's the cornerstone of all modern electronics. Everything ships with zero day patches. My $3000 TV crashes when you navigate an OSD menu the wrong way. Not the unnecessary smart features that it shipped with - that I of course augmented with a separate $300 purchase - but the actual 'treat me like a display' menu.
I work for a SaaS company, and just as much work goes into deciding how we measure uptime as goes into designing for it, if not more. "Well, no customer incidents were reported, so that doesn't count as being down." "We have 1 hour of scheduled maintenance every week, but we still achieved 99.99% uptime." It's creative, I'll give them that.
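The gap between honest and creative uptime accounting is easy to quantify. A back-of-the-envelope sketch (the weekly numbers and the `uptime_pct` helper are illustrative, not from the quoted company): one hour of downtime per 168-hour week is only ~99.4% uptime, but excluding that hour from the denominator as "scheduled maintenance" reports 100%.

```python
# Illustrative sketch of "creative" uptime accounting: excluding scheduled
# maintenance from the measurement window inflates the reported figure.

HOURS_PER_WEEK = 7 * 24  # 168

def uptime_pct(downtime_hours: float, excluded_hours: float = 0.0) -> float:
    """Uptime as a percentage over one week.

    excluded_hours: downtime reclassified as 'scheduled maintenance' and
    removed from both the downtime total and the measurement window.
    """
    measured = HOURS_PER_WEEK - excluded_hours
    down = max(downtime_hours - excluded_hours, 0.0)
    return 100.0 * (measured - down) / measured

# Honest accounting: 1 hour down out of 168.
honest = uptime_pct(1.0)  # ~99.40%

# Creative accounting: the same hour, excluded as "scheduled maintenance".
creative = uptime_pct(1.0, excluded_hours=1.0)  # 100.00%

print(f"honest:   {honest:.2f}%")
print(f"creative: {creative:.2f}%")
```

For scale: genuinely hitting 99.99% over a week allows only about one minute of downtime, which is why the exclusions do all the work in these claims.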
We talk about the network being unreliable as if a 200 km 28 GHz link and a trunk connection in a data center are the same thing. It's unqualified, and unhelpful, and nobody really knows what they are doing.
We "dismantle" waterfall as if it's not the same type of people who misunderstood the original publication doing the same thing with every other methodology and fad. (If you haven't read "The Leprechauns of Software Engineering" yet, it's an interesting read and worth a bit of your time.)
My house is full of devices, and my history is full of purchases, that are a disappointment. I can't remember the last time I went a single. god. damn. day. without the things that are supposed to be helping me misbehaving in some way. And the worst part is that many of them can't even be fixed. They will putter along, with the occasional patch, until they lose the attention of some swim lane on a plan of record somewhere and become e-waste.
I have been programming since I was eight. It was the most obvious passion I have ever found in life, but it feels like we're stuck. The arguments all feel like the same boring old ones, rehashed over the last 20 years, probably longer. I'm bored. Is anybody else just tired of it all? Everything is amazing and crappy at the same time.
Isn't it weird that we have entire companies like Intercom or Rasa whose value-add is pushing automated, AI-driven "assistants" onto websites, and then the companies that buy into that value-add, with codebases hacked on by ML experts, find that none of it even works better than it did in 1998?