Food labelling is so dishonest

Food labelling sucks. I’m not talking about all the crap that gets put into food that is listed in such small print you need Ant-Man to read it for you. Even the basic information like calories is really misleading.

Background

I’m fat. It’s not my hormones or anything else. I eat too much. What you feed a family, I will quite happily eat for myself. For the amount I eat, I’m surprised I’m not even fatter.

A few months ago the doctor suggested I go on some drugs to lose weight. You know the kind of things I’m talking about, because they are very “in fashion” at the moment. Losing weight would definitely help some of my other medical conditions, so I totally understand why they would suggest that.

The issue I have with this is I’m in my mid-fifties, and the muscle and bone density I would likely lose on these drugs would leave me with a worse outcome than just eating like a human, rather than a garbage compactor. Rather than take more drugs, I’m trying to put my garbage compactor days behind me.

Over the last few months I’ve been eating a normal volume of food and I’ve lost a lot of weight. Enough that a number of people have commented. I’ve got some way to go, but the movement is in the right direction.

This is the background to my current gripe.

Calories

I come from a science background and believe in “calories in vs calories out”. I know that if I eat fewer calories than I expend I lose weight. I know if I drop the calories too far I lose weight really quickly, but it is hard to sustain. If I stay on a smaller calorie deficit I drop the weight more slowly, but I don’t feel like I’m suffering.

The other nice thing about tracking calories is there are no foods that are banned. I just have to make sure at the end of the day I hit my target. If I really fancy something, I have it. I just have to juggle things to allow for it.

Although I still get cravings, I find myself making better choices over time, because I know calorie dense foods will mean eating less volume of food in total, and I will feel really hungry later.

If you use some other methods to lose weight and it’s working for you, that’s great news and keep doing it! For me, logging my calories into MyFitnessPal works. If I eat too much one day, I eat a bit less the next couple of days to balance it out. It’s the only way to keep myself honest and not kid myself it’s not my fault.
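The balancing act is just simple arithmetic. Here’s a quick sketch of the idea in Python (the 2,000 kcal daily target is an invented example figure, not a recommendation or my actual target):

```python
# Sketch of balancing a calorie overage across the following days.
# The 2,000 kcal daily target is an invented example figure.
DAILY_TARGET = 2000

def adjusted_targets(overage, days=2, target=DAILY_TARGET):
    """Spread yesterday's overage evenly across the next few days."""
    per_day = overage / days
    return [target - per_day] * days

# Ate 500 kcal over target yesterday, so eat 250 less today and tomorrow.
print(adjusted_targets(500))  # [1750.0, 1750.0]
```

Apps like MyFitnessPal do this kind of bookkeeping for you, but the principle is no more complicated than this.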

So back to the subject of this post.

Food label obfuscation

It really gets on my nerves how dishonest food labels are. They don’t explicitly lie, but they make it so hard to make accurate judgements. I’m a huge fan of noodles, and all noodles have crap labelling. Here’s an example of the calorie information on the label of my favourite brand.

  • Per 100g (as prepared) = 109 kcals
  • Per serving (as prepared) = 76 kcals

Clearly a packet is a single serving, right? Nobody would eat less than a packet of them, right? Well, after some digging I found that per packet, which makes a small bowl of noodles, the calories are 322 kcals. So the portion size they are suggesting is about a quarter of one little packet. I know I’m a fatty, but that portion size is crazy.
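If you want to sanity check the label maths yourself, the numbers above are enough to back out what the suggested serving actually is:

```python
# Figures taken from the noodle label above.
KCALS_PER_100G_PREPARED = 109
KCALS_PER_SERVING = 76
KCALS_PER_PACKET = 322

# Prepared weights implied by the per-100g figure.
packet_grams = KCALS_PER_PACKET / KCALS_PER_100G_PREPARED * 100
serving_grams = KCALS_PER_SERVING / KCALS_PER_100G_PREPARED * 100

print(f"Packet: ~{packet_grams:.0f}g prepared")    # ~295g
print(f"Serving: ~{serving_grams:.0f}g prepared")  # ~70g
print(f"Servings per packet: {KCALS_PER_PACKET / KCALS_PER_SERVING:.1f}")
```

So their “serving” is about 70g of a roughly 295g bowl, or a bit over four servings per packet.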

It’s clear the real portion size is one packet per person, so why not put on the label that it is 322 kcals per packet? Well, then people would feel bad about eating 2-3 packs each, which is what I want to do. 🙂

This type of labelling is clearly designed to deceive consumers into thinking the food is low calorie, when it is not. I’ve googled my favourite brand of noodles and I can see loads of people misquoting the calories as 76, when it should be 322.

Portion sizes

In the previous section I was also making a comment about portion sizes, but it came from the angle of obscuring the calorie information, so I had to dig to find out what the portion size meant. In this section I want to talk about products that clearly mention ridiculous portion sizes.

Cereals have been a long-time annoyance to me. I rarely eat them as they are just a bowl full of carbs, but the portion sizes on them are laughable. Depending on the cereal, they will often have a serving size of 30-40g. Get a food scale and measure 40g of cereal and look at it. There is no way an adult eats 40g of cereal.

Sometimes labels give nutritional information including the milk, but sometimes they don’t.

The point is, the portion sizes they recommend are in no way equivalent to what people are eating. Even thin people. My nephews are like skinned rabbits and they easily eat 4 times this per bowl.

It’s so easy to read the label and think you are getting 100 calories, when you may be getting four or five times that, even if you don’t count the milk.

You are being set up to lose

I think food labelling makes it so hard to keep track of what you are eating.

I hear people saying stuff like, “I’ve tried every diet and I just can’t lose weight”, and they really believe it. If you are not losing weight, you have to eat less. Exercise is a bonus, and good for general health, but you can’t exercise off a bad diet, unless you are running marathons every day. Normal people will only lose weight by eating fewer calories, however they manage to do it. For you it might be paleo/carnivore, for someone else it might be vegan. However you get there, you are eating at a calorie deficit to lose weight.

Not giving people information in an easy and honest manner is setting them up for failure. We should not expect people to no-life on the internet to find out that it’s 322 calories, not 76 calories, in their noodles.

Overall

Try to be sceptical of food labels. In the early days it pays to weigh things until you get a good eye for portions. If you start to gain weight, or stop losing weight, start measuring again to retrain your portion control.

Also, try to think of the positives. I’m saving so much on food bills that I can nearly afford the ridiculous money I pay for cat food. 🙂

If you are on a health journey, good luck!

Cheers

Tim…

Automation needs updating too

The automations you build are another piece of software, and like all software they need maintenance and updating from time to time. Here’s a little story to illustrate this.

Background

I’ve written about automation patterns for typical DBA tasks before, and the overall pattern remains true, but in recent months we’ve had to make some changes.

  • We still use Teamcity as the automation server for most of the automated DBA tasks, but the architecture of this has now changed a lot.
  • We no longer use Artifactory to hold our software repository.
  • We no longer use Artifactory as our container registry.
  • We no longer use BitBucket for our cloud-based Git repositories.

I’ll discuss each of these below.

Teamcity

Our original installation of Teamcity was a manual installation on a Windows server, with the database housed on a shared SQL Server. The maintenance of that was a bit of a pain, mostly done by my boss, so a while ago he asked me to look at using a container version of Teamcity.

We were in the process of refreshing our SQL Server shared service, so I moved the Teamcity database on to a newer SQL Server Always On cluster. After a bit of messing around I finally got the container version of Teamcity to work, and that’s how we have run for some time now.

The only bump in the road was that we were using an Oracle Linux 7 (OL7) server as a build agent. The build agent is the thing that actually runs builds and contacts the servers for deployments and to run automations. After getting Teamcity working as a container I conveniently “forgot” about the build agent… 🙂

More recently I returned to the build agent issue, fired up a Teamcity agent container and started to play with it. There was a bit of messing around, but I eventually got it to run scripts on the destination servers. That solved most of the problems.

Next I had to modify the image to include a few different versions of OpenJDK to support some builds. Extending an image is pretty simple. It’s just another Dockerfile.

The final step was to let the container build Podman images. The Teamcity agent supports “docker-in-docker”, but from what I could see not “podman-in-podman”. After a bit of thinking I decided to run the builds on a separate Podman server, so the agent was just calling scripts on another server. It saved having to reinvent the wheel. With that done, the container version of the build agent was capable of covering all the bases.
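To give an idea of what I mean by “just another Dockerfile”, here’s a minimal sketch of extending the stock agent image. The base image tag, package names and JDK versions are illustrative examples, not our actual config:

```dockerfile
# Illustrative sketch: extend the stock TeamCity agent image with
# extra OpenJDK versions. The tag and package names are examples only.
FROM jetbrains/teamcity-agent:latest

USER root

# Add the extra JDK versions some builds still need.
RUN apt-get update && \
    apt-get install -y --no-install-recommends openjdk-11-jdk openjdk-17-jdk && \
    rm -rf /var/lib/apt/lists/*

# Drop back to the unprivileged agent user.
USER buildagent
```

Build it, push it to your registry, and point the new agents at that image instead of the stock one.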

We are still testing some of the developer builds, but all the DBA automations are now running on the new container agents, so happy days.

At some point we may ditch Teamcity and move all automations to GitHub Actions, but that is for another day…

Software Repository

We have an on-prem software repository that holds all our software (Oracle DB software, Java, Tomcat, ORDS, SQLcl, APEX, patches etc.). This was originally using Artifactory running on-prem. We could have used a cloud solution, which we do for the development build artefacts, but the idea was to keep the software repository close to the on-prem servers, otherwise we would be constantly downloading stuff from the internet, which would be a pain.

It was decided to remove this and replace it with an NGINX server to deliver the software. A little while ago we had switched from running Artifactory on a VM to running it in a container. So I just fired up an NGINX container with the relevant config, and downloaded the software from Artifactory into the new NGINX software repository. A little search and replace of some automation scripts and all the software was being pulled from the new location.
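Serving a directory of software over HTTP needs very little NGINX config. Something like this sketch, where the server name and paths are made up for illustration:

```nginx
# Illustrative sketch: serve the software repository as static files.
# The server name and root path are made-up examples, not our real config.
server {
    listen 80;
    server_name software-repo.example.com;

    location / {
        root /usr/share/nginx/html/software;
        autoindex on;   # directory listings, so the repo can be browsed
    }
}
```

In the container case the software directory is a volume mounted into the container, which is what makes the recreate-on-upgrade approach painless.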

Keeping the software repository up to date is as simple as destroying the existing container and recreating it with a newer image. It takes a few seconds. Of course the software is kept in a persistent volume, so it’s not lost. 🙂

I’ve written about software repositories here.

Container Registry

When using containers we want the ability to build an image once and use it for multiple containers. Typically you do that by building the image and pushing it to a container registry. When you need a new container, you pull the base image from the container registry and fire up the new container using that. We were originally using Artifactory as our container registry, but with the move to GitHub, we decided to switch to using the GitHub Container Registry (GHCR). I’m no expert on container registries, but apart from some firewall changes, the switch was effortless. It’s just a different location in your “docker push/pull” or “podman push/pull” commands. Since all our builds and deployments are automated, it was just a search and replace in some scripts to switch to the new container registry.
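To give you an idea of how small the change is, the workflow looks something like this. The user, org and image names are placeholders, not our real ones:

```shell
# Placeholder names throughout - substitute your own user, org and image.
# Authenticate against GitHub Container Registry (uses a personal access token).
podman login ghcr.io -u my-github-user

# Tag the locally built image for GHCR and push it.
podman tag my-image:latest ghcr.io/my-org/my-image:latest
podman push ghcr.io/my-org/my-image:latest

# On the target server, pull the image from the new location.
podman pull ghcr.io/my-org/my-image:latest
```

Swap `podman` for `docker` and the commands are the same, which is why the switch was mostly search and replace.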

Git

We’ve used BitBucket as our cloud-based Git solution for many years. Over the last few years newer projects have started to use GitHub Enterprise, with GitHub Actions as the automation server.

In the last few weeks we’ve transferred all of the Git repos used for our automations across to GitHub. The transfer itself wasn’t too much drama. The bigger issue for me was having to visit loads of servers and alter the locations of their git repos and test them. Most of our automations connect to servers, do a git pull of the latest config and automation scripts, then run them. So each server has one or more Git repos to support automation. It wasn’t hard work, but it was dull.
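The change on each server was trivial, just repetitive. Something like this per repo, where the paths and URLs are placeholders:

```shell
# Placeholder path and URL - substitute the real repo locations.
cd /path/to/automation-repo

# Check where the repo currently points.
git remote -v

# Re-point origin at the new GitHub location, then test it.
git remote set-url origin git@github.com:my-org/automation-repo.git
git pull
```

Multiply that by every automation repo on every server and you can see why it was dull.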

Conclusion

I’ve said a number of times in the past, the tools are not really the important part of automation. It’s the approach and attitude that matters. This is a perfect example of that. In this case three of the four tools have changed, and the fourth has altered architecturally. The end result is the same, but the tooling has changed a little.

Automation code is another piece of software, and it will evolve over time. One of the reasons we don’t overcomplicate our automation is we want it to be easy when things change, and they always do.

Cheers

Tim…

Microblogging : Death by fragmentation

Microblogging seems to be in a death spiral at the moment. Here are some observations based on my accounts. I’m not saying this is true for everyone, but it certainly is for me.

Shop Front

I’m going to use the term “shop front” throughout this post. What I mean by it is when someone posts about their latest blog post or video, then disengages. It’s just a notification service. Not really a way of engaging.

I’m not saying people are trying to sell a product as such. Just promoting their content using microblogging platforms. I’m not saying this is a bad thing, because I do it myself. It’s just a comment about the style of use.

Twitter/X

The former king of microblogging is now a wasteland. Twitter was always a dumpster fire, but in the past there was some balance. Now it seems to be a war between two extreme ends of the spectrum, with very little happening in the middle. My timeline is full of adverts and posts by people I don’t know. The signal to noise ratio is terrible.

Even the people I do know are using Twitter in a very different way now. People used to be actively engaging, but now they use it like a shop front. If I’m honest, I do the same thing. Twitter is not a nice place to be now.

I still use Twitter for DMs, but I notice people take a lot longer to answer these days. I guess they are only logging in once a day to check things, rather than many times a day as they used to.

I have a lot of followers on Twitter, but the engagement is terrible.

Mastodon

This really is the platform that never was. I have a small number of followers, most of whom also follow me on other platforms. If I post something, I know a couple of people will respond on Mastodon, but really it is dead in the water. If I didn’t already have the account, I wouldn’t start one now.

Facebook

My Facebook is private, but I do have a public Facebook page for my Oracle content. I post links to blog posts there. There is very little engagement, which doesn’t surprise me, because I put in very little effort, and I don’t think it is a good platform for that type of content. It is just another shop front for me. This is another case of if the account didn’t already exist, I wouldn’t start it.

LinkedIn

I get the most engagement for Oracle content on LinkedIn these days. If I were forced to pick a single site, it would be LinkedIn. It’s not a place I would post random comments about my cat’s bodily functions, but I see some people treating it less like a business site all the time.

LinkedIn is one of the few places that feels less like a shop front these days.

BlueSky

I know a lot of the Oracle community have moved on to BlueSky now, but the engagement is really low. Going from 15K followers on Twitter to a couple of hundred on BlueSky has an impact. From what I can see it’s just another shop front for most people. After their announcement post there seems to be little in the way of engagement.

Threads

Absolutely just a shop front for me. Of the few followers I have, most follow me elsewhere also, and not surprisingly engagement is really low.

TikTok & Instagram

Gross generalisation incoming, but young people only care about TikTok and Instagram. All the other platforms don’t even exist to them. I’m not sure TikTok and Instagram are the right place for my content, but from a business perspective, if you are targeting the younger generation, you might as well forget the rest and just focus on these.

Just to confirm something raised by Philipp Hartenfeller on Twitter, I’m not saying only young people use TikTok and IG. What I’m saying is young people rarely use the other platforms.

Fragmentation

I remember going through periods where I felt I lived on Twitter. It felt really vibrant and engaging. Now microblogging is so fragmented it seems like more effort than it is worth.

In the past someone would make a comment, and their comment would spark a chain of responses. Now it seems the odd “like” is about all people can muster. I don’t blame them, because I’m the same. In the past some posts would have “made me” respond. Now I’m just like “whatever”.

Just like any community, there is a minimum viable number of active people required to make things happen, and at the moment microblogging feels so fragmented that all we have left is a couple of inbred families sitting on the porch playing their banjos. 🙂

Profitability?

beat_ramseier on Mastodon raised a good point that I thought I’d add here. Having seen what happened to Twitter, I think a lot of us feel reluctant to go through the process of growing our followers again, because microblogging is not profitable for the company running the service, and any platform that is successful risks breaking under the load. Faced with that, they may be forced to get more external investment, which will no doubt turn them into the new Twitter. It all feels a bit “once bitten, twice shy” for me.

Cheers

Tim…

PS. If you do want to follow me, you can find all my socials on Linktree.

Trivialising past events

A few weeks ago I read an article that was trivialising the Y2K bug.

Y2K History

For those that don’t remember the lead up to the year 2000, it was a really big thing. It stemmed from the fact that loads of computer systems used two-digit years (YY) in their interfaces. If they were lucky they stored the dates properly, using some rule to convert them, but that wasn’t always the case. The problem was that as we switched from the 1900s to the 2000s, many of those systems were going to store 00 as 1900, rather than 2000. That was a big problem, and we really didn’t know how big the impact would be.
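For those wondering what “some rule” looked like, it was typically a pivot (or “windowing”) rule, something like this sketch. The pivot of 70 is an arbitrary example; real systems picked their own cutoff:

```python
# Sketch of a typical two-digit year "windowing" rule.
# The pivot of 70 is an arbitrary example; real systems chose their own.
PIVOT = 70

def expand_year(yy):
    """Map a two-digit year to a four-digit one using a pivot rule."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(expand_year(99))  # 1999
print(expand_year(5))   # 2005
print(expand_year(0))   # 2000 - without a rule like this, 00 became 1900
```

Of course, any window only pushes the problem out by a century or so, which is the point of the next paragraph.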

Incidentally, I notice some interfaces now seem to use two-digit years again, so I guess we are creating a new set of issues for the year 2100 (corrected by Yngvi Þór), but that won’t be my problem.

What we did

A very large amount of money was invested by companies to work through their applications, checking for incorrect date usage and fixing it. Almost everyone I know who worked in tech at the time was doing this stuff.

Remember, this is back in the late 90s, so it wasn’t as easy as firing up a VM, changing the system date and testing your app. For one job a colleague and I were flown over to Cincinnati to test a system, because it was cheaper to fly us to some test kit than it was to get the test kit to us.

It was a massive investment in time and money, but it worked out. Y2K came and went and we didn’t have planes falling out of the sky, and most systems carried on working. There were a few things that failed, but it wasn’t really headline news.

Trivialising Y2K

So back to the article in question. It was essentially making out we were all stupid for worrying about Y2K because it didn’t end up being a problem after all. Well duh you muppet! We fixed all the problems before the deadline. That is why planes didn’t fall out of the sky and your bank accounts weren’t trashed!

Don’t try and make out you were confident this wouldn’t happen before we did all this work, because if you do you are either stupid, or a liar!

It happens all the time

This type of thinking really gets on my nerves. It’s a type of historical negationism. It’s like all those people now trivialising COVID-19, because they have conveniently forgotten about all the people that died unnecessarily before the vaccines came online, and the 13+ billion doses of COVID-19 vaccines that helped keep a lot of us from getting hospitalised, or dying.

Conclusion

When your favourite celebrity, politician or cult leader starts doing this crap, call them on their bullshit. It’s not big and clever to trivialise history. It’s a big problem.

Cheers

Tim…