

maxresdefault-1994702608.jpg
2022.11.26

Owner of Social Network Data aka Indieweb

During the campaign for Congress, I had to transform my social networks into a real sales platform. They all had to be tightly aligned around the same purpose.

I should have already hired some social media management tool, like Hootsuite or Zoho Social (Zoho has been my online service provider for years), because this is no trivial task. There were more than 8 networks, and much of the content was copied from one to another.

In addition to the difficulty of managing the various networks, there was confusion about which version of the content I would consider official. Canonical. Especially among networks that are essentially competitors.

  • Twitter or Mastodon?
  • Tiktok or Youtube Shorts or Instagram Stories?
  • Facebook or my blog?

It gets substantially worse with stories like Elon Musk’s takeover of Twitter. He’s made so many changes to the platform that it’s not impossible that the company eventually goes broke. If that happens, years of content would be thrown away. And what about the constant changes in rules and permissions?!

Indieweb: The owner of the content is ME

To address some of these concerns, I’m trying to centralize the source of information in a system that I have full control over. And nothing is better suited than this site itself. Here I do whatever I want: optimize images (stripping metadata from images is a concern I never had much discipline about), customize their appearance. This then becomes the official center of what I do.

POSSE is the practice of Publish (on your) Own Site, Syndicate Elsewhere: in other words, publish links or copies on other social networks, always citing the original source, so that anyone can follow you directly at the source.
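The idea can be sketched in a few lines: compose the short copy to cross-post elsewhere so that it always ends with a link back to the canonical source. The function name and the 280-character limit here are illustrative assumptions, not part of any real API.

```python
# Minimal POSSE sketch: a syndication copy that always cites the source.
# The 280-char limit is an assumption (Twitter-like); adjust per network.

def syndication_copy(title: str, canonical_url: str, limit: int = 280) -> str:
    """Build the text to cross-post elsewhere, ending with the source link."""
    suffix = " " + canonical_url
    room = limit - len(suffix)
    text = title if len(title) <= room else title[: room - 1] + "…"
    return text + suffix

print(syndication_copy("Owner of Social Network Data aka Indieweb",
                       "https://brunomassa.com/blog/indieweb/"))
```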

Reposting on Twitter and Mastodon is easy, as posts there are usually text and a few images. The Meta/Facebook properties are more annoying because they are richer in content and have no API to automate with. The video platforms are even more work, as hosting videos myself is quite expensive (I’ve always felt that Youtube does an almost humanitarian job in hosting such a volume of data).

For now, I must keep old content on its source platforms. Gradually I will try to write only here. Eventually I may even export all the old content from these services, to have it backed up and published statically on the site.

Microformats, Fediverse and Webmention

To adapt the site to be the center of the online universe, some changes need to be made:

✅ Make it easier for computers to also understand the site

I implemented microformats on the site and in the content, so that any other system that reads the site can extract the main information: the author, title, content, publication date. Much of this information already appears visually on the site. As humans, we can understand it easily, but computers cannot. Therefore, I made a series of modifications so that the content is also easily understood by machines.
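To illustrate what this buys a machine reader, here is a stdlib-only sketch of extracting a few common h-entry properties (`p-name`, `dt-published`, `p-author`) from a page. The sample HTML is illustrative; a real consumer would use a full microformats2 parser.

```python
# Sketch: what a microformats-aware reader could pull out of an h-entry page.
from html.parser import HTMLParser

class HEntryParser(HTMLParser):
    """Collect the text of elements carrying simple microformats classes."""
    WANTED = {"p-name", "dt-published", "p-author"}

    def __init__(self):
        super().__init__()
        self.props = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        hits = self.WANTED.intersection(classes)
        if hits:
            self._current = hits.pop()

    def handle_data(self, data):
        if self._current:
            self.props[self._current] = data.strip()
            self._current = None

html = """<article class="h-entry">
  <h1 class="p-name">Owner of Social Network Data</h1>
  <time class="dt-published">2022-11-26</time>
  <span class="p-author">Bruno</span>
</article>"""

parser = HEntryParser()
parser.feed(html)
print(parser.props)
```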

Separating the different types of content

As I use the blog as a tool for longer texts, daydreams and ideas, I’m thinking of creating specific lists for small texts (tweets/toots) and maybe images (today the site has a tag that points to my photo posts). That way everything would be better indexed and found: blog and notes. My site generator, Hugo, allows for several approaches. The question is which is the best implementation.

Integrate comments, replies and shares made on other sites

I’ve tried some commenting tools before, like Disqus and Cactus.chat (a super cool concept of using Matrix as a comment backend). I don’t have a lot of traffic here, so it didn’t matter much. But the goal now is bigger: to include comments and reposts of my content made on other sites.

The W3C itself (the organization that standardizes the web) created the Webmention standard, a way to formalize that someone is commenting on someone else’s content. That’s how I can maintain a single, ongoing discussion about the content I’ve posted, across any network.

“Automatically” republish content written here on other social networks

This will take some time, as I will need to use a number of external services that will read my site’s RSS feed and post to the social networks on my behalf. Essentially a Hootsuite/Zoho Social type of solution. Even better if it’s open source. I will investigate the use of n8n.
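A rough sketch of what such a service would do: read the site’s RSS feed and pick out the items published since the last syndication run. The feed content and the cutoff date below are illustrative; a real setup (n8n, Hootsuite, etc.) would then push each new item to the social networks.

```python
# Sketch: select RSS items newer than the last syndication run.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

RSS = """<rss version="2.0"><channel><title>brunomassa.com</title>
<item><title>Indieweb</title><link>https://brunomassa.com/a</link>
  <pubDate>Sat, 26 Nov 2022 10:00:00 +0000</pubDate></item>
<item><title>Old post</title><link>https://brunomassa.com/b</link>
  <pubDate>Sun, 27 Mar 2022 10:00:00 +0000</pubDate></item>
</channel></rss>"""

def new_items(rss_text: str, since: datetime) -> list[tuple[str, str]]:
    """Return (title, link) of items published after the given moment."""
    root = ET.fromstring(rss_text)
    items = []
    for item in root.iter("item"):
        published = parsedate_to_datetime(item.findtext("pubDate"))
        if published > since:
            items.append((item.findtext("title"), item.findtext("link")))
    return items

last_sync = datetime(2022, 11, 1, tzinfo=timezone.utc)
print(new_items(RSS, last_sync))  # [('Indieweb', 'https://brunomassa.com/a')]
```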


As soon as I manage to implement more things, I’ll post them here. I want to make it as easy as possible for the next people excited to take control of their own digital lives.

multilanguage.jpg
2022.03.27

Multilanguage Site

brunomassa.com has always been an international site. Almost all posts were written in English. But now that I will venture into politics (more about this in a later post), it’s worth separating the Brazilian Portuguese content. Mixing in posts about programming, gaming and movies would just confuse followers and voters.

Instead of removing unrelated content, I decided to just split the languages. The Brazilian Portuguese edition will show more posts about the Brazilian scene. Curiously, most of the old posts written in pt-br were already about politics or football, so they already fit the job!

I do not know how to create a hybrid model using Hugo, with most of the language-independent content intertwined. If I find one, I will enable it in the future. It will be especially useful for hot-pages, those pages that serve as an entry point for promotions and special situations. It would not be good to create a hot-page at brunomassa.com/pt-br/hot-page, because that would defeat its purpose of being easy to remember and share. To help even further, I’ve just bought the brmassa.com domain, aligning it with my other social media usernames.

So, I’m going to start working towards generating more content about politics and the Brazilian context. This week I plan to make the now-not-much-of-a-surprise announcement.

This very post is multi-language. Click the small flags to switch.

bad-review-rating-two-star-ss-1920.jpg
2021.09.24

Rating Badge

As a programmer and businessman, I try to organize the world. So I created a unified Ratings page consolidating all the reviews I’ve done: games, board games, books, movies and TV shows.

For a few of them I wrote a full blog post, but for most I did not. That was driving me crazy. I often mention the same games/movies in multiple posts. When that happened with a piece of art I had not previously reviewed, I felt pressure to do so. I still might, but it’s no longer required. Now the non-reviewed-but-rated works are properly acknowledged. And I get consistency.

I’m going to scan all previous blog posts in the next few days to cross-reference them, but the main step is done.

⭐⭐⭐⭐⭐

Also, in a similar vein to the previous post, Rating Art, I decided to give my ratings more visual appeal. For now, besides the numeric 0-10 rating, they will show the corresponding number of stars.

bad-review-rating-two-star-ss-1920.jpg
2021.09.03

Rating Art

Rating things is a real art. Especially if we are rating art. Not much thought is put into it at first; eventually things start to get complicated and ambiguous.

Time

Cultural references change. What was good 100 years ago might be simply unacceptable nowadays. There are plenty of movies, sculptures, paintings and songs that portray racism, misogyny or prejudice that were normal at the time. It’s complicated to reevaluate them using our modern mental framework.

Also, our own taste changes with time. Things that were cool when we were young might be embarrassing years later. #cringe

Technology

Technological improvements change our perspective on quality. A silent or black-and-white movie, a radio-quality song recording, an Atari Pong. Today, however, it’s a hard sell to have such limitations in a modern piece of art.

Sometimes these technological changes make it plainly impossible to appreciate the art later on. Video games are particularly affected, since the medium on which they are consumed is part of the experience. Virtual Boy headaches during hours and hours of playtime were part of the nostalgia, but how do you compare that with a modern XR game if the hardware itself is hard to find and get working?

Single Fixed Scale

Finally, we have to reduce all the rich details into a numeric scale.

I would prefer an infinite positive scale, one that always grows with new titles. That way, Pong would never be in the same league as a modern AAA 3D story-driven adventure game. But at the same time, one could honestly appreciate an old movie almost as much as a flashy new one.

Still, a single fixed scale, be it 1-5, 0-10, a percentage, or even the super weird American F-A concept, is an easier way to deal with things. Almost everyone uses one in some shape or form.

My take

There is much to discuss.

At least for now, I’m going to simplify my ratings a bit. I use a 0-10 scale, with .5 steps. There is no need for those decimal points. A 0-10 scale is enough to separate good from bad. Numerically, 9.4 is better than 9.3, but in practice the rating mostly conveys that something is an amazing game/movie/book, not that one is better than the other. The finer distinctions belong in the qualitative analysis of each review.

Also, using half-points in practice doubles the range. It’s in fact a 20-point scale. No need for such granularity.

So I’m updating all the past ratings that have decimal points, rounding them up or down depending on each case.
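A tiny sketch of that migration: collapsing the old half-point ratings onto a whole-number 0-10 scale. The round-half-up rule here is my illustrative choice; the post leaves the direction to case-by-case judgment.

```python
# Sketch: migrate 0-10 ratings with .5 steps to a whole-number scale.
import math

def to_integer_scale(rating: float) -> int:
    """Round a 0-10 rating with .5 steps to the nearest integer (ties up)."""
    return min(10, math.floor(rating + 0.5))

old_ratings = [9.5, 8.0, 7.5, 6.5]
print([to_integer_scale(r) for r in old_ratings])  # [10, 8, 8, 7]
```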

One might notice that I’ve never used the 1-3 ratings and have barely used anything below 6. It’s not a problem with the scale per se. It’s more about the selection process that occurs before consuming a game or movie: I try to focus on works that won awards or were mentioned and commented on by someone else. I might eventually rethink the scale to lump everything below a threshold into a single category and focus the scale on the range above it.

This way I tend to consume only reasonably good products and, therefore, only give reasonably good ratings! Good for me, if you ask me.

mola.jpg
2021.08.11

Hugo Images Processing

Hugo, the static website generator, is a fantastic tool, as I’ve told you before. Since I switched to it, I’m very confident that the site is fast and responsive.

However, my site is packed full of images. Some are personal. Some are really big. Some are PNGs and some are JPGs. I created a gallery component just to handle posts that I want to fill with dozens of them.

Managing post images is a boring task. For every post, I have to check:

  • Dimension
  • Compression
  • EXIF metadata
  • Naming

Dimension

Hugo images processing size 2.jpg

Having an image bigger than the screen is useless: it’s a larger file to download, consuming bandwidth from both the user and the server. Google Lighthouse and other site metric evaluators all recommend resizing images to at most the screen size.

In Hugo, since the target width is already defined, it’s easily automated using its image functions:

{{ $image_new := ($image.Resize (printf "%dx" $width)) }}

Compression

Loss compression comparison.png

My personal photos are, most of the time, taken in JPEG. Recently I changed my phone camera’s default compression to HEIC, which provides better compression for hi-resolution photos. The web, however, does not support that format.

Some pictures used to illustrate the posts are PNG. They have better quality at the expense of being larger. Mostly, only illustrations and images with text are worth keeping in a lossless format.

Whatever the format, I would like to compress as much as possible, to waste less bandwidth. I’m currently inclined to use WebP, because it can shrink the final size by a considerable amount.

{{ $image_new := ($image.Resize (printf "%dx webp" $width)) }}

EXIF metadata

Each digital image has a lot, and I mean A LOT, of metadata embedded inside the file. The day and time it was taken, the camera model, the phone name; even longitude and latitude might be included by the camera app. They all reveal personal information that was supposed to stay hidden.

In order to share them on the open public internet, it is important to sanitize all images, stripping out all this information. Hugo does not carry this info along when it generates new images. So, as long as every image gets at least a minimal resize, this matter is handled by default.
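For sanity-checking that claim, here is a stdlib-only sketch that scans a JPEG’s segment markers and reports whether an EXIF block (an APP1 segment starting with `Exif\x00\x00`) is present. Handy for verifying that the resized output really dropped the metadata. The hand-built byte strings below are minimal illustrative headers, not full images.

```python
# Sketch: detect an EXIF (APP1) segment in JPEG bytes, stdlib only.
import struct

def has_exif(data: bytes) -> bool:
    """Walk JPEG segments from SOI and look for an APP1/Exif segment."""
    if data[:2] != b"\xff\xd8":          # not a JPEG (missing SOI marker)
        return False
    pos = 2
    while pos + 4 <= len(data):
        marker, length = struct.unpack(">HH", data[pos:pos + 4])
        if marker == 0xFFE1 and data[pos + 4:pos + 10] == b"Exif\x00\x00":
            return True
        if marker == 0xFFDA:             # start of scan: no more metadata
            return False
        pos += 2 + length                # skip segment (length includes itself)
    return False

# Two minimal hand-built JPEG headers, one with an Exif APP1 segment:
with_exif = b"\xff\xd8\xff\xe1\x00\x10Exif\x00\x00" + b"\x00" * 8 + b"\xff\xda"
without = b"\xff\xd8\xff\xdb\x00\x04\x00\x00\xff\xda"
print(has_exif(with_exif), has_exif(without))  # True False
```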

Naming

I would like to have a well-organized image library, and it would be nice to standardize the file names. Using the post title to rename all images would be great, even more so with a caption or user-provided description.

However, Hugo does not allow renaming them. To make matters worse, it appends a hash code to each file name. A simple picture.jpeg suddenly becomes picture-hue44e96c7fa2d94b6016ae73992e56fa6-80532-850x0-resize-q75-h2_box.webp.

An incomprehensible mess. If you know a better way, let me know.

So What?

So, if most of these routines can be automated, what’s the problem?

The main problem is that Hugo has to pre-process ALL images upfront. As mentioned in the previous post, this can take a considerable amount of time, especially when converting to a computationally demanding format such as WebP.

Netlify is constantly hitting the build time limit, all because of the thousands of image compressions. I am planning to revert the commits in which I implemented WebP and reapply them little by little, allowing Netlify to build a version and cache the results.

There are some categories of images:

  • gallery full-size images: there are hundreds of them and they would take a lot of processing time, but resizing strips the metadata from the originals. The advantage is that they are rarely clicked and served.
  • gallery thumbnails: the actual images shown in gallery mode. They account for the biggest chunk of the main page’s overall size when a gallery is among the top 10 latest posts.
  • post images: images that illustrate each article. They are resized to fit the whole page, so compressing them represents a nice saving.
  • post top banner: some posts have a top image. They are cropped to fit a banner-like size, so they are generally not that big.

I will, over the next couple of hours, try to implement the WebP code for each of these groups. If successful, it will save hundreds of megabytes in the build.

Bonus Tip

Hugo copies all resources (images, PDFs, audio, txt, etc.) from the content folder to the final public/ build, even if you only use the resized versions. Not only does the build become larger, but the images whose metadata you wanted to hide are still online, there, even if not directly linked in the HTML.

A tip for those working with Hugo and a lot of processed images: add the following to the content front matter to instruct Hugo not to include these unused resources in the final build.

cascade:
  _build:
    publishResources: false

Let’s build.

Edit on 2021-08-25

I discovered that Netlify has a plugin ecosystem. And one of the available plugins is a Hugo caching system. It would speed up build times drastically, as well as make it possible to convert all images to WebP once and for all. I will test this feature right now and post the results later.

Edit on 2021-09-13

The plugin worked! I had to set it up via the configuration file instead of the easy one-click button. Build time went from 25 minutes to just 2. The current cache size is about 3.7 GB, so that’s totally understandable.

It will allow me to push much more frequent updates. Ok, to be frank: the build will no longer be what restricts posting frequency. However, patience, inspiration and focus are still the main constraints on blogging.

On netlify.toml file on root, I added:

# Hugo cache resources plugin
# https://github.com/cdeleeuwe/netlify-plugin-hugo-cache-resources#readme
[[plugins]]
package = "netlify-plugin-hugo-cache-resources"

[plugins.inputs]
# If it should show more verbose logs (optional, default = true)
debug = true
Bruno 𝕄𝔸𝕊𝕊𝔸