
Web

2022.11.26

Owner of Social Network Data aka Indieweb

During the campaign for Congress, I had to transform my social networks into a real sales platform. They all had to be super aligned with the same purpose.

I should have already set up some social media management tool, like Hootsuite or Zoho Social (Zoho has been my online service provider for years), because this is no trivial task: there were more than 8 networks, and much of the content was duplicated across them.

In addition to the difficulty of managing the various networks, there was confusion about which version of the content I would consider official. Canonical. Especially among networks that are essentially competitors:

  • Twitter or Mastodon?
  • Tiktok or YouTube Shorts or Instagram Stories?
  • Facebook or my blog?

It gets substantially worse with stories like Elon Musk’s takeover of Twitter. He has made so many changes to the platform that it’s not impossible to think the company will eventually go broke. If that happens, years of content would be thrown away. And what about the constant changes in rules and permissions?!

Indieweb: The owner of the content is ME

To address some of these concerns, I’m trying to centralize the source of information in a system that I fully control. And nothing is better suited than this very site. Here I do whatever I want: optimize images (one of my concerns is that I’ve never had much discipline in removing metadata from images) and customize their appearance. This site then becomes the official center of what I do.

POSSE is the practice of Publish (on your) Own Site, Syndicate Elsewhere: publish links or copies on other social networks, always citing the source of the content, so that anyone can follow you directly at the source.

Reposting on Twitter and Mastodon is easy, as posts there are usually text and a few images. The Meta/Facebook sites are more tedious because they are richer in content and have no API to automate posting. The video platforms are even more work, since hosting videos yourself is quite expensive (I’ve always felt that YouTube does an almost humanitarian job hosting such a volume of data).

For now, I must keep old content on its source platforms. Gradually, I will try to write only here. Eventually, I may even export all the old content from those services to have a backup and publish it statically on the site.

Microformats, Fediverse and Webmention

To adapt the site to be the center of the online universe, some changes need to be made:

✅ Make it easier for computers to also understand the site

To that end, I implemented microformats on the site and in its contents, so that any other system that reads the site can extract the main information: the author, title, content, and publication date. Much of this information already appears visually on the site; we humans can understand it easily, but computers cannot. Therefore, a series of modifications was made so that the contents are also easily understood by machines.
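
As an illustration, a minimal h-entry markup in a Hugo template might look like the sketch below (the class names come from the microformats2 vocabulary; the template variables are standard Hugo page fields):

<article class="h-entry">
  <h1 class="p-name">{{ .Title }}</h1>
  <a class="p-author h-card" href="{{ .Site.BaseURL }}">{{ .Site.Title }}</a>
  <time class="dt-published" datetime='{{ .Date.Format "2006-01-02T15:04:05Z07:00" }}'>
    {{ .Date.Format "2006-01-02" }}
  </time>
  <div class="e-content">{{ .Content }}</div>
</article>

A parser that understands microformats2 can pull the author, title, content, and publication date straight out of those classes, with no separate API needed.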

Separating the different types of content

As I use the blog as a tool for longer texts, daydreams, and ideas, I’m thinking of creating specific lists for small texts (tweets/toots) and maybe images (today the site has a tag that points to my photo posts). That way, everything would be better indexed and found: blog and notes. My site generator, Hugo, allows several approaches; the question is how to implement it best. One possibility is sketched below.
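
One possible layout, using Hugo’s standard content sections (the section names here are hypothetical):

content/
  blog/    # long-form texts, daydreams, and ideas
  notes/   # short tweet/toot-like entries
  photos/  # image posts (today just a tag)

Each section then gets its own list page and RSS feed automatically, which would also help with the syndication step further down.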

Integrate comments, replies, and shares made on other sites

I’ve tried using some commenting tools before, like Disqus and Cactus.chat (a super cool concept: using Matrix as the comment backend). I don’t have a lot of traffic here, so it wouldn’t matter much. But the goal is now bigger: to include comments and reposts of my content made on other sites.

The W3C (the organization that standardizes web technologies) created Webmention, a standard way to formalize that someone is commenting on someone else’s content. It’s the only way I can maintain a great discussion about content I’ve posted across all networks.
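
Under the hood, a webmention is just an HTTP POST with two form-encoded parameters: source (the page doing the mentioning) and target (the page being mentioned). A rough example, with a hypothetical source URL and target post:

POST /webmention HTTP/1.1
Host: brunomassa.com
Content-Type: application/x-www-form-urlencoded

source=https://example.org/a-reply-post&target=https://brunomassa.com/posts/some-post/

The receiving endpoint then fetches the source page and verifies that it really links to the target before storing the mention.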

“Automatically” republish content written here on other social networks

This will take some time, as I will need some external service that reads my site’s RSS feed and posts to the social networks on my behalf. Essentially a HootSuite/Zoho Social type solution, even better if it’s open source. I will investigate using n8n; the sketch below shows the basic idea.
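
A minimal version of the RSS-to-Mastodon leg in Go, just to show how little machinery is involved (the instance URL and the MASTODON_TOKEN environment variable are placeholders, and a real version would need to remember which items it already posted):

// minimal sketch: read the newest item from the site's RSS feed
// and post a link to it on Mastodon
package main

import (
	"encoding/xml"
	"net/http"
	"net/url"
	"os"
	"strings"
)

// feed models only the fields we need from Hugo's default RSS output
type feed struct {
	Items []struct {
		Title string `xml:"title"`
		Link  string `xml:"link"`
	} `xml:"channel>item"`
}

func main() {
	// Hugo generates index.xml at the site root by default
	resp, err := http.Get("https://brunomassa.com/index.xml")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var f feed
	if err := xml.NewDecoder(resp.Body).Decode(&f); err != nil {
		panic(err)
	}
	if len(f.Items) == 0 {
		return
	}

	// Mastodon's statuses endpoint takes a form-encoded "status" field
	form := url.Values{"status": {f.Items[0].Title + " " + f.Items[0].Link}}
	req, err := http.NewRequest("POST",
		"https://mastodon.example/api/v1/statuses", // placeholder instance
		strings.NewReader(form.Encode()))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("MASTODON_TOKEN"))
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
	if _, err := http.DefaultClient.Do(req); err != nil {
		panic(err)
	}
}

Something equivalent can likely be wired up in n8n with its RSS and HTTP request nodes, without writing code at all.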


As soon as I manage to implement more things, I’ll post them here. I want to leave the least amount of work possible for the next people who are excited to take control of their own digital lives.

2022.03.27

Multilanguage Site

brunomassa.com has always been an international site; almost all posts were written in English. But now that I will venture into politics (more about this in a later post), it’s worth keeping the Brazilian Portuguese content separate. Mixing in posts about programming, gaming, and movies would only confuse followers and voters.

Instead of removing unrelated content, I decided to split by language. The Brazilian Portuguese edition will feature more posts about the Brazilian scene. Curiously, most of the old posts written in pt-br were already about politics or football, so they are already fit for the job!

I do not yet know how to create a hybrid model in Hugo, with most language-independent content intertwined. If I figure it out, I will enable it in the future. It would be especially useful for hot-pages, those pages that serve as an entry point for promotions and special situations. It would not be good to create a hot-page at brunomassa.com/pt-br/hot-page, because that would defeat its purpose of being easy to remember and share. To help even further, I’ve just bought the brmassa.com domain, aligning with my other social media usernames.
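
For reference, the language split itself lives in Hugo’s site configuration; a minimal sketch in TOML (the names and weights are illustrative):

defaultContentLanguage = "en"

[languages]
  [languages.en]
    languageName = "English"
    weight = 1
  [languages.pt-br]
    languageName = "Português"
    weight = 2

With that in place, Hugo renders a parallel site tree under /pt-br/ for content marked with that language, while English stays at the root.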

So, I’m going to start working towards generating more content about politics and the Brazilian context. This week I plan to make the now-not-much-of-a-surprise announcement.

This very post is multi-language. Click on the small flags to switch.

2021.09.24

Rating Badge

As a programmer and businessman, I try to organize the world. So I created a unified Rating page consolidating all the reviews I have done: games, board games, books, movies, and TV shows.

For a few of them, I wrote a full blog post, but for most I did not, and that was driving me crazy. I often mention the same games/movies in multiple posts, and when that happens with a piece of art I had not previously reviewed, I felt pressure to do so. I still might, but it’s not required anymore: the non-reviewed-but-rated are now properly acknowledged. And I shall have consistency.

In the next few days, I’m going to scan all previous blog posts to cross-reference them, but the main step is done.

⭐⭐⭐⭐⭐

Also, in a similar vein to the previous post, Rating Art, I decided to give my ratings more visual appeal. For now, besides the numeric 0-10 rating, it will show the corresponding number of stars.

2021.09.03

Rating Art

Rating things is a real art, especially if we are rating art. At first, not much thought is put into it; eventually, things start to get complicated and ambiguous.

Time

Cultural references also change. What was good 100 years ago might simply be unacceptable nowadays. There are plenty of movies, sculptures, paintings, and songs that portray racism, misogyny, or prejudice that were normal at the time. It’s complicated to reevaluate them using our modern mental framework.

Also, our own tastes change with time. Things that were cool when we were young might be embarrassing years later. #cringe

Technology

Some technological improvements change our perspective on quality: a silent or black-and-white movie, a radio-quality song recording, an Atari Pong. Today, it’s a hard sell to have such limitations in a modern piece of art.

Sometimes, these technological changes make it plainly impossible to appreciate the art later on. Video games are particularly affected, since the medium on which they are consumed is part of the experience. Virtual Boy headaches during hours and hours of playtime were part of the nostalgia, but how can one compare that with a modern XR game if the hardware itself is hard to find and get working?

Single Fixed Scale

Finally, we have to reduce all the rich details into a numeric scale.

I think an infinite positive scale, one that always grows with new titles, would be better. Pong would never be in the same league as a modern AAA 3D story-driven adventure game, but at the same time, one could honestly appreciate an old movie almost as much as a flashy new one.

Still, a single fixed scale, whether 1-5, 0-10, a percentage, or even the super weird American F-to-A letter-grade concept, is an easier way to deal with things. Almost everyone uses one in some shape or form.

My take

There is much to discuss.

At least for now, I’m going to simplify my ratings a bit. I use a 0-10 scale with half-point steps, but there is no need for those decimals: a 0-10 scale is enough to separate good from bad. Numerically, 9.4 is better than 9.3, but in practice the score should convey that something is an amazing game/movie/book, not that one is marginally better than another. The details belong in the qualitative analysis of each review.

Also, using half-points in practice doubles the range; it’s, in fact, a 20-point scale. There is no need for such granularity.

I’m updating all past ratings that have decimal points, rounding them up or down depending on the case.

One might notice that I’ve never used the 1-3 ratings and barely used anything below 6. It’s not a problem with the scale per se; it’s about the selection process that occurs before I consume a game or movie: I try to focus on titles that won awards or were mentioned and commented on by someone else. I might eventually rethink the scale to lump everything below a threshold into a single category and keep the granularity for what’s above it.

This way, I tend to consume only reasonably good products and, therefore, only give reasonably good ratings! Good for me, if you ask me.

2021.08.11

Hugo Images Processing

Hugo, the static website generator, is a fantastic tool, as I have told you before. Since I switched to it, I’m very confident that the site is fast and responsive.

However, my site is packed full of images. Some are personal. Some are huge. Some are PNGs and some are JPGs. I created a gallery component just to handle posts that I want to fill with dozens of them.

Managing post images is a boring task. For every post, I have to check:

  • Dimension
  • Compression
  • EXIF metadata
  • Naming

Dimension

[Image: Hugo image processing size comparison]

Serving an image bigger than the screen is useless: it’s a bigger file to download, consuming bandwidth for both the user and the server. Google Lighthouse and other site-metric evaluators all recommend resizing images to at most the screen size.

In Hugo, I’ve automated this with template functions:

{{ $image_new := ($image.Resize (printf "%dx" $width)) }}
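
For context, a slightly fuller version might look like the following (the variable names and the 850px cap are illustrative, and nil checks are omitted for brevity; the guard avoids accidentally upscaling smaller originals):

{{ $image := .Resources.GetMatch "*.jpg" }}
{{ $width := 850 }}
{{ if gt $image.Width $width }}
  {{ $image = $image.Resize (printf "%dx" $width) }}
{{ end }}
<img src="{{ $image.RelPermalink }}" width="{{ $image.Width }}" height="{{ $image.Height }}" alt="{{ .Title }}">

Giving only the width ("850x") tells Hugo to compute the height itself, preserving the aspect ratio.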

Compression

[Image: Lossy compression comparison]

My personal photos are, most of the time, taken as JPEGs. Recently I changed my phone camera’s default compression to HEIC, which provides better compression for high-resolution photos. Browsers, however, do not support that format.

Some pictures used to illustrate posts are PNGs. They have better quality at the expense of being larger. Mostly, only illustrations and images containing text are worth keeping in a lossless format.

Whatever the format, I would like to compress as much as possible to waste less bandwidth. I’m currently inclined to use WebP, because it can shrink the final size by a considerable amount.

{{ $image_new := ($image.Resize (printf "%dx webp" $width)) }}
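
The same options string also accepts a quality setting, so the WebP compression level can be tuned per call (if I’m not mistaken, 75 is Hugo’s default; the q60 below is illustrative):

{{ $image_new := ($image.Resize (printf "%dx webp q60" $width)) }}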

EXIF metadata

Each digital image has a lot, and I mean A LOT, of metadata embedded inside the file: the day and time it was taken, the camera type, the phone name; even longitude and latitude might be included by the camera app. All of it reveals personal information that was supposed to stay hidden.

In order to share them on the open, public internet, it is important to sanitize all images, stripping out all this information. Hugo does not carry this metadata along when it generates new images, so as long as every image gets at least a minimal resize, the matter is handled by default.

Naming

I would like to have a well-organized image library, and it would be nice to standardize the file names. Using the post title to rename all images would be great, even more so if combined with a caption or user-provided description.

However, Hugo does not allow renaming them. To make matters worse, it appends a hash code to each file name. A simple picture.jpeg suddenly becomes picture-hue44e96c7fa2d94b6016ae73992e56fa6-80532-850x0-resize-q75-h2_box.webp.

An incomprehensible mess. If you know a better way, let me know.

So What?

So, if most of these routines can be automated, what’s the problem?

The main issue is that Hugo has to pre-process ALL images upfront. As mentioned in the previous post, this can take a considerable amount of time, especially when converting to a computationally demanding format such as WebP.

Netlify is constantly hitting its build time limit, all because of the thousands of image compressions. I am planning to revert the commits in which I implemented WebP and reapply them little by little, allowing Netlify to build each version and cache the results.

There are some categories of images:

  • Gallery full-size images: there are hundreds of them, and they would take a lot of the processing time, but processing them also strips the metadata from the originals. The advantage is that they are rarely clicked and served.
  • Gallery thumbnails: the actual images shown in gallery mode. They account for the biggest chunk of the main page’s overall size whenever a gallery is among the top 10 latest posts.
  • Post images: images that illustrate each article. They are resized to fit the page width, so compressing them represents a nice saving.
  • Post top banners: some posts have a top image. These are cropped to a banner-like size, so they are generally not that big.

In the next couple of hours, I will try to implement the WebP code for each of these groups. If successful, it will save hundreds of megabytes in the build.

Bonus Tip

Hugo copies all resources (images, PDFs, audio, text, etc.) from the content folder to the final public/ build, even if you only use the resized versions. Not only does the build become larger, but the original images whose metadata you wanted to hide are still online, even if not directly referenced in the HTML.

A tip for those working with Hugo and a lot of processed images: add the following to the content front-matter to instruct Hugo not to include these unused resources in the final build.

cascade:
  _build:
    publishResources: false

Let’s build.

Edit on 2021-08-25

I discovered that Netlify has a plugin ecosystem, and one of the available plugins is a Hugo caching system. It should drastically speed up build times and make it feasible to convert all images to WebP once and for all. I will test this feature right now and post the results later.

Edit on 2021-09-13

The plugin worked! I had to set it up via the configuration file instead of the easy one-click button. Build time went from 25 minutes to just 2. The current cache size is about 3.7 GB, which makes the previous build times totally understandable.

It will allow me to make much more frequent updates. OK, to be frank: it will no longer be what restricts posting frequency. Patience, inspiration, and focus are still the main constraints on blogging.

In the netlify.toml file at the project root, I added:

# Hugo cache resources plugin
# https://github.com/cdeleeuwe/netlify-plugin-hugo-cache-resources#readme
[[plugins]]
package = "netlify-plugin-hugo-cache-resources"

[plugins.inputs]
# If it should show more verbose logs (optional, default = true)
debug = true
Bruno MASSA