Tauri is a Rust-based toolkit that lets developers build software for desktop operating systems. Apps built with the toolkit are much, much smaller than apps built with, e.g., Electron.
Tauri recently reached its stable 1.0 release, which was a long time in the making. This means we can now, in good conscience, recommend the toolkit for production use.
The Bottom Line
Tauri started out from a strong position in the market. The solution is incredibly well-liked among developers – it took first place for satisfaction in the mobile and desktop frameworks category of the 2021 edition of State of JS. Big companies have not ignored it either: even Microsoft and Alibaba are giving it a try. It’s leaner, faster, more secure, and outputs apps that are roughly 10x smaller than its competitors’.
Smaller app bundles are something we want to highlight. Generally, the smaller an app is, the faster it can be (a bit of a simplification), and the less carbon its transmission over the network emits.
The project’s co-founder prepared a table comparing the environmental impact of transporting apps of different sizes across the internet. We will not comment on the values any further; we will only note that, on average, apps built using Electron.js are roughly 100–200 MB.
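The figures above can be put together in a quick back-of-the-envelope calculation. A minimal sketch, assuming a mid-range 150 MB Electron bundle and the roughly 10x size reduction claimed earlier; the download count is a purely illustrative assumption:

```python
# Bandwidth comparison based on the figures in the text:
# Electron bundles average roughly 100-200 MB, and Tauri apps
# are said to be about 10x smaller.

ELECTRON_MB = 150        # assumed mid-range Electron bundle size (MB)
SIZE_RATIO = 10          # Tauri apps are roughly 10x smaller
DOWNLOADS = 100_000      # hypothetical number of downloads

tauri_mb = ELECTRON_MB / SIZE_RATIO
saved_gb = (ELECTRON_MB - tauri_mb) * DOWNLOADS / 1000

print(f"Tauri bundle: ~{tauri_mb:.0f} MB per download")        # ~15 MB
print(f"Saved over {DOWNLOADS:,} downloads: ~{saved_gb:,.0f} GB")  # ~13,500 GB
```

Even under these rough assumptions, the savings add up quickly at scale, which is the point the co-founder’s table makes.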
Nokia is bringing back iconic phone styles from the late 90s and early 00s, providing an option "for people seeking to escape technology fatigue and 'always on' culture."
First on the list is the Nokia 8210 4G, an upgraded version of the model originally released in 1999. It's your classic Nokia phone with 4G access and, presumably, an enhanced version of Snake. The main selling point is the long-forgotten convenience of weeks of standby battery time.
Next on the list is one for the audiophiles: the Nokia 5710 XpressAudio, an audio-centric phone with dedicated audio controls, enhanced speakers, and, most intriguingly, built-in charging for the wireless earbuds that come with the phone. For the active individual, this might be much more convenient to carry on your arm than a massive smartphone.
Last on the list is probably the most sought-after phone style if you were around during the early 2000s. Yes, that's right, the flip phone is back. It looks to be designed for the over-50 crowd, which seems about right on the nostalgia timeline. It has a big display, hearing aid compatibility, enhanced accessibility features, and big buttons to provide a seamless experience for those who remember the Y2K scare.
The Bottom Line
Upgrading some solid designs with new tech is not a bad strategy; Nokia did make some great phones in the past. A few years ago, Nokia released an upgraded 3310 – the phone everyone knows will survive us all. The original 3310 was also a catalyst for customizing your handset, thanks to the various faceplates you could swap onto it.
There is a movement, albeit a small one, to break away from the smartphone. With the expanding creepiness of tracking tech, data brokers, and notification overload, some would prefer a phone that just works and doesn't need to be charged three times a day. So, for those who wish to simplify their phone, these options don't look half bad.
Virtual machines are a great way to offload computing tasks, whatever they might be. Amazon recently introduced EC2 instances with the Apple M1 processor. More precisely, they are just Mac minis sitting in a data center, made available to you.
You can use them as remote computers thanks to Apple Remote Desktop. The M1 instances are available in only two regions: US East (N. Virginia) and US West (Oregon). “[T]here is a minimum charge of 24 hours for reserving a dedicated host. […] In the two preview Regions, the on-demand price is $0.6498 per hour.”
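The quoted pricing is easy to turn into concrete numbers. A minimal sketch using the preview on-demand rate and the 24-hour minimum from the quote above; the ~$699 Mac mini retail price is an assumption added for comparison:

```python
# Cost sketch for an EC2 M1 Mac dedicated host.
HOURLY_RATE = 0.6498       # USD/hour, preview regions (from the quote)
MINIMUM_HOURS = 24         # minimum dedicated-host reservation
MAC_MINI_PRICE = 699.00    # assumed retail price of an M1 Mac mini

minimum_charge = HOURLY_RATE * MINIMUM_HOURS
month_around_the_clock = HOURLY_RATE * 24 * 30
break_even_hours = MAC_MINI_PRICE / HOURLY_RATE

print(f"Minimum reservation: ${minimum_charge:.2f}")          # $15.60
print(f"30 days, 24/7: ${month_around_the_clock:.2f}")        # $467.86
print(f"Break-even vs. buying: ~{break_even_hours:.0f} hours")  # ~1076 hours
```

In other words, occasional use is cheap, while running an instance around the clock for a month and a half would already cost about as much as the hardware itself.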
The Bottom Line
The Apple M1 instances are a great way to develop for Apple platforms without the costly upfront investment of purchasing Apple hardware. As Apple policy stands, you cannot develop apps for the ecosystem without opting into it, fully or partially – or, at the very least, it's not an easy task.
Connecting to a Mac that you do not have to purchase is a neat solution: you can develop iOS or macOS apps in peace without making a relatively large upfront investment.
Apart from that, the dominant language in the codebase is Zig, a fresh, low-level programming language. The major differences between Zig and other programming languages are the lack of “hidden control flow” and of “hidden allocations”, all while remaining a simple language (as the project maintainers claim). If you use C or C++, you may also try the new language as a “drop-in C/C++ compiler.”
Watch a video by Fireship for a quick rundown of Bun.
The Bottom Line
Bun authors had three things in mind:
The first two goals seem to have been achieved. If the benchmarks are to be believed, we have a true beast on our hands: Bun is a few times faster than Node or Deno.
It also seems to have batteries included. It has got:
We have a fascinating innovation on our hands, and we can’t wait to see how the project will advance.
GPT-3 is an impressive feat of modern data science. Just watch the video below:
To recoup the funds required to create the project, OpenAI teamed up with companies such as Microsoft on commercial projects. It does not share the details of what data was used to train the model, nor the code.
Another model, BLOOM (“BigScience Large Open-science Open-access Multilingual Language Model”), took an entirely different approach. Not only is it bigger than GPT-3, it is also as transparent as possible: the researchers shared the data BLOOM was trained on, and if you wish to download the model itself, you can do so at https://huggingface.co/bigscience/bloom.
The Bottom Line
The world of AI is becoming deeply entrenched and divided. The biggest enterprises develop their own models, while everyone else has no access to whatever they build. BLOOM is a direct answer to that: it is hosted on a publicly accessible website and available for free.
Furthermore, environmental concerns were taken seriously: the supercomputer used to train the model ran mostly on nuclear energy, and the heat it generated was reused to warm internal areas of the facility, offsetting the impact.
The project, financed by the French government, is a phenomenal answer to Google’s LaMDA and GPT-3. It’s free, and we know what data was used to train the model. Furthermore, in the cases of Arabic and Spanish, it’s the first model of this size.