According to the event’s website, “Devs For Ukraine is a free, online engineering conference with the goal to raise funds and provide support to Ukraine.” The livestream will be available at 4:00 PM UTC (12:00 PM Eastern Time, 9:00 AM Pacific Time).
The lineup of speakers is impressive: Dan Abramov of Meta, Cassidy Williams of Remote, Jessica Janiuk of Google, Jose Valim of Dashbit (the creator of the Elixir programming language), and more!
The organizers had already reached their $50,000 goal before this article was published, so let’s see how much further we can push the total. Donate by going to https://www.devsforukraine.io/?modal=donate
The Bottom Line
Events like this are a great way to raise funds for Ukraine and for Ukrainians currently fighting off the invasion. In light of current events, NGOs operating in the country need all the funding they can get.
ITMAGINATION helps Ukraine as well. We donated 100,000 PLN to PAH, and we are committed to helping Ukrainians find jobs and settle in Poland.
Rust, the programming language for super-fast and super-safe app development, received its 1.60.0 update last week. The update stabilized LLVM-based source code coverage and added a way to collect more information on build performance with `cargo build --timings`. We also saw the re-enablement of incremental compilation, after the team had disabled it due to a bug. Additionally, there is an impressive number of stabilized APIs. See the list here.
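As a quick illustration, here is a minimal sketch of two of the newly stabilized APIs, integer `abs_diff` and `Arc::new_cyclic`, assuming a Rust 1.60+ toolchain:

```rust
use std::sync::{Arc, Weak};

// A node that holds a weak reference back to itself.
struct Node {
    me: Weak<Node>,
}

fn main() {
    // `abs_diff`: the absolute difference between two integers, regardless
    // of operand order and without risk of overflow.
    let (a, b): (u8, u8) = (10, 250);
    assert_eq!(a.abs_diff(b), 240); // plain `a - b` would panic in debug builds

    // `Arc::new_cyclic`: construct a value that stores a `Weak` pointer
    // to its own allocation.
    let node = Arc::new_cyclic(|me| Node { me: me.clone() });
    assert!(node.me.upgrade().is_some());
}
```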
The Bottom Line
Rust keeps improving and fixing its shortcomings. The language has already gone through impressive changes along the way, e.g., adding support for async operations.
All improvements to the language benefit blockchain and embedded-platform developers the most, as they rely heavily on Rust. We also can’t forget about all the companies using the language, which originated at Mozilla, on their front-ends, such as Figma and 1Password.
Elastic, the company most famous for Elasticsearch and Kibana, announced a partnership with Google Cloud and eight other organizations. Quoting their blog post:
“Elastic is partnering with Google Cloud and eight other data organizations to launch the Data Cloud Alliance–an initiative to accelerate the pace of digital transformation for organizations around the world by expanding data accessibility, integrations for better insights and action.”
The Alliance’s goal is to “accelerate the pace of digital transformation.” GCP “will provide infrastructure, API, and integration support for all environments,” while the rest of the members will “collaborate on new, common industry data models, processes, and platform integrations to increase data portability and reduce complexity associated with data governance and global compliance.”
The Bottom Line
Global compliance is becoming harder and harder, with many data protection frameworks in force across the globe at the same time and others in development. Mentioning just two of them, the EU’s GDPR and California’s CCPA, is enough to make companies go crazy. An initiative aiming to reduce the complexity of the whole process is great news for companies of all sizes, though medium-sized and big ones will benefit the most.
Confluent, the company offering the one and only “re-architected [Apache] Kafka,” announced a deepened “Strategic Partnership with Microsoft.” The goal of the alliance is to “reduce the operational burden of managing data streams in Azure and speed up real-time application development in the cloud.” Both companies admit that their clients across industries need an easy way to move their data to the cloud, and that the ability to process data in real time is essential to their businesses.
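To make the “managing data streams” part concrete, here is a minimal, hypothetical sketch of publishing a single event to a Kafka topic from Rust using the community `rdkafka` crate. The broker address and the `orders` topic are made up for illustration; a managed cluster (on Azure or elsewhere) would additionally need the authentication settings from its connection configuration:

```rust
// Assumed dependencies in Cargo.toml:
//   rdkafka = { version = "0.36", features = ["tokio"] }
//   tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
use std::time::Duration;

use rdkafka::config::ClientConfig;
use rdkafka::producer::{FutureProducer, FutureRecord};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical local broker; a real cluster needs SASL/SSL settings too.
    let producer: FutureProducer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092")
        .create()?;

    // Send one event to the (hypothetical) `orders` topic and wait for the
    // broker to acknowledge delivery.
    let delivery = producer
        .send(
            FutureRecord::to("orders")
                .key("order-1")
                .payload(r#"{"total": 42}"#),
            Duration::from_secs(5),
        )
        .await;

    match delivery {
        Ok((partition, offset)) => {
            println!("delivered to partition {partition} at offset {offset}");
        }
        Err((err, _msg)) => return Err(err.into()),
    }
    Ok(())
}
```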
The Bottom Line
The cooperation between the two companies could indeed result in improved real-time data processing. You may watch how Walgreens Boots Alliance, a gigantic player on the US market, modernized its business thanks to the improved Apache Kafka.
That’s not all when it comes to massive amounts of data. According to a report titled “Data Age 2025,” we may expect around 175 zettabytes of data generated annually by 2025. That’s 175,000,000,000,000 gigabytes (!). Of course, it’s hard to get an exact estimate, or even one that will prove somewhat accurate. Nevertheless, the amount of data we generate is enormous, and many companies will want their large (not 175 ZB large, of course) data processed in real time. This is where Confluent’s offering might shine.
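As a back-of-the-envelope check on that figure (nothing more than unit conversion):

```rust
fn main() {
    // 1 ZB = 10^21 bytes and 1 GB = 10^9 bytes, so 1 ZB = 10^12 GB.
    let zettabytes: u64 = 175;
    let gigabytes = zettabytes * 10u64.pow(12);
    println!("{zettabytes} ZB = {gigabytes} GB"); // 175 ZB = 175000000000000 GB
}
```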