Hacker News | jhasse's comments

Signal is so much worse than WhatsApp from a UX perspective. Backup sync forces you to allow background permissions (WhatsApp doesn't), you have to set a PIN and get nagged to re-enter it every few weeks (WhatsApp doesn't), there's no transcription for audio messages (WhatsApp has that for some languages), the desktop app loses its connection if you don't open it every few weeks (WhatsApp works fine), etc.

If you want people to switch, recommend Telegram.


>If you want people to switch, recommend Telegram.

Why would people switch from always-end-to-end encrypted group chats to never-end-to-end encrypted group chats?


Because they don't even know what e2e encryption is.

Yes. Let's switch to an app with Russian connections that has actively refused to implement E2EE for over a decade now.

The Russian connections are FUD, and Telegram does have E2EE, just not by default.

Said E2EE is mobile-only and completely unavailable in group chats.

You are moving the goalposts. But you're right: Signal's E2EE is miles better than Telegram's. I was just trying to point out my experience in getting people to switch; most of the time they have different priorities.

My circle switched to Signal because we are concerned about tech bros and a fascist America.

Boosting Russia is not the solution.


Telegram is not Russian. In fact Putin hates Pavel Durov.

That's where the standard should come in and say something like "starting with C++26 char is always 1 byte and signed. std::string is always UTF-8" Done, fixed unicode in C++.

But instead we get this mess. I guess it's because there's too much Microsoft in the standard, and they are the only ones that still don't have UTF-8 everywhere on Windows.


char is always 1 byte. What it isn't always is 1 octet (8 bits).

You're right. What I meant was that it should always be 8 bits, too.

std::string is not UTF-8 and can't be made UTF-8. It's encoding agnostic, its API is in terms of bytes not codepoints.

Of course it can be made UTF-8. Just add a codepoints_size() method and other helpers.

But it isn't really needed anyway: I'm using it for UTF-8 (with helper functions for the 1% cases where I need codepoints) and it works fine. But starting with C++20 it's starting to get annoying because I have to reinterpret_cast to the useless u8 versions.


First, because of existing constraints like mutability through direct buffer access, a hypothetical codepoints_size() would require recomputation each time, which would be prohibitively expensive, in particular because std::string is virtually unbounded.

Second, there is also no way to be able to guarantee that a string encodes valid UTF-8, it could just be whatever.

You can still just use std::string to store validly encoded UTF-8, you just have to be a little bit careful. And functions like codepoints_size() are pretty fringe -- unless you're doing specialized Unicode transformations, it's more typical to just treat strings as opaque byte slices in a typical C++ application.
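As a sketch of the kind of helper being discussed (codepoints_size is a hypothetical name, and the string is assumed to hold valid UTF-8), counting codepoints reduces to counting non-continuation bytes:

```cpp
#include <cstddef>
#include <string>

// Counts Unicode codepoints in a std::string assumed to hold valid
// UTF-8. UTF-8 continuation bytes have the form 10xxxxxx, so every
// byte that does NOT match that pattern starts a new codepoint.
std::size_t codepoints_size(const std::string& s) {
    std::size_t n = 0;
    for (unsigned char c : s) {
        if ((c & 0xC0) != 0x80) ++n;  // not a continuation byte
    }
    return n;
}
```

Note this is O(n) on every call, which is the recomputation cost mentioned above.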


Perfect is the enemy of good. Or do you think the current mess is better?

std::string _cannot_ be made "always UTF-8". Is that really so contentious?

You can still use it to contain UTF-8 data. It is commonly done.


I never said always. Just add some new methods for which it has to be UTF-8. All current functions that need an encoding (e.g. text IO) also switch to UTF-8. Of course you could still save arbitrary binary data in it.

> That's where the standard should come in and say something like "starting with C++26 char is always 1 byte and signed. std::string is always UTF-8" Done, fixed unicode in C++.

> I never said always

Yes you totally did. And regarding "add some new methods for which it has to be UTF-8", there is no need at all to add UTF-8 methods to std::string. It would be a bad idea. UTF-8 is not bound to a particular type (or C++ type). It works on _any_ byte sequence.


Visual Studio is mostly written in C# btw.

Back in 2005 it was mostly in C++ and it was blazing fast. IMHO VS 2005 was the most performant edition. I never liked VS 2003, felt bloated in comparison.

Let's also not forget one big reason VSCode took over and Sublime lost: VSCode is gratis and (mostly) open-source, while Sublime is proprietary.

On macOS too. On both operating systems, 99% of apps do, though. Maybe it's 99.9% on macOS vs. 99.8% on Windows. But I'm using HiDPI on both, and it was a long time ago that I last encountered an app that didn't support it.

Double click on the edge on Windows :)

Nice! It works on the top and bottom edges, too.

It can be: which definition should it jump to if there are multiple (e.g. across multiple translation units)? What if the function is overloaded and none of the types match?

With grep it's easy: it always shows everything that matches.


Sure, there might be multiple definitions to jump to.

With grep you get lots of false positives, and for some languages you need a lot of extra rules to know what to grep for. (E.g. in Python you might read `+` in the caller, but you actually need to grep for __add__ to find the definition.)
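A tiny sketch of the Python case mentioned above (the class and names are made up for illustration):

```python
class Vec:
    def __init__(self, x, y):
        self.x, self.y = x, y

    # This is the definition you need when a caller writes `a + b`.
    # Grepping for "+" won't lead you here; you have to know that
    # Python dispatches the operator to __add__.
    def __add__(self, other):
        return Vec(self.x + other.x, self.y + other.y)

a, b = Vec(1, 2), Vec(3, 4)
c = a + b  # dispatches to Vec.__add__
```

A go-to-definition tool that understands operator dispatch can make this jump; plain grep cannot.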


I disagree that Mercurial was better. For example Git was always much faster.


I remember that: for repos around 1 GB, Mercurial just became unusable. It took half a day just to clone the repo.


That sounds like y'all may have been storing binary blobs and large files without the right plugins set up.


Just rename the root directory of the project and you double the size of your repo: Mercurial didn't support renames and did a delete/add instead, so the repo size grows pretty fast.


GNOME still has some problems with fractional scaling, but KDE works perfectly. I'm using two displays, one with 150% and one with 100%. No blurry apps and absolutely no issues. Have you tried it recently?


Can you independently set desktop wallpapers on the two screens? I know this seems nitpicky but it's literally impossible with Ubuntu/Gnome as far as I know; I have one vertical and one horizontal and have to just go with a solid color background to make that work.


Yes. It was actually more tedious to do the inverse: when I wanted three screens rotating wallpapers from the same set of folders, I had to set the list of folders three times.


KDE is in better shape than GNOME, but there are still some nits. Nearly all the available third party themes for example are blurry or otherwise render incorrectly with fractional scaling on.


Still better than macOS and Windows, where third-party theming is basically non-existent.

And we were talking about why Linux wasn't an alternative to macOS, weren't we?


So don't use a third party theme.


Problem is, the stock themes aren't to my taste at all.


Why not send a pull request to one of your theme maintainers?


To my understanding, doing that wouldn't be helpful due to hard technical limits that can't be reconciled. Most window chrome themes are Aurora themes, which don't play nice with HiDPI, and to change that they'd need to be rewritten as C++ themes (like the default Breeze theme is), which is beyond the capabilities of most people publishing themes.


Did you try Klassy?[1]

[1] https://github.com/paulmcauley/klassy


I have not, looks high quality though.


That’s not a KDE issue though, blame the themes


I've been using fractional scaling on Gnome for years (including on the laptop I'm typing this on) and haven't had any issues. I haven't tried it with two displays that are set differently though. Is that a common thing?


Open an X11 app and it will be blurry.

Also fractional scaling is not supported out-of-the-box in GNOME, you have to set a config value to use it IIRC.
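For reference, the config value in question looks like this on a recent GNOME under Wayland (the exact key may vary by GNOME version):

```shell
# Fractional scaling is behind an experimental mutter feature flag;
# after setting it, the fractional options appear in Settings > Displays.
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```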


inline and ; are redundant


> inline and ; are redundant

One of my s/w engineering axioms is:

  Better to express intent than assume a future
  reader of a solution, including myself, will
  intrinsically understand the decisions made.
If this costs a few extra keystrokes when authoring an implementation, so be it.

