The HP x2 2-in-1 Notebook, and How Well It Runs Fedora

I was working with a company a few months ago that had just bought a batch of notebooks (HP x2 2-in-1 notebook laptops) to loan out to their clients. Since they had such a tough time getting the notebooks configured properly, I looked into getting GNU/Linux working on them. I had initially tested Linux on one of their old notebook tablets, and that looked promising. However, I didn’t stay around long enough to proceed with further testing (my contract ended). Still, I thought the notebooks they bought looked interesting, so I got one of my own and proceeded to install Fedora on it.

The HP x2 2-in-1 Notebook in all its Fedora glory

Initially, I thought of installing Ubuntu, because of its great hardware support. To my surprise, Fedora worked because it had the proper bootloader setup (its UEFI files were arranged correctly). That, plus the fact that I had been wanting to learn more about Fedora, made it a good match.

The model I have is an HP x2 Detachable Notebook, model number 10-p018wm, and it took two tries to get it to boot from the LiveUSB. I had to disable Secure Boot (as usual with notebooks like these), and I had to figure out how to get to the boot selection screen (press Escape as the device is booting up, then press F9). The installation of Fedora 29 was smooth, too, though I had to remove Windows because I couldn’t free up enough storage space. By default, Fedora 29 installs Gnome, and using that was a bit of a chore.

When people criticize Gnome, half the time the criticism seems to be about the design. For me, though, the problem didn’t lie in the design of the interface. Navigating through the windows, bars, application screen, and the rest of Gnome 3.30 was easy, since Gnome’s design is now mostly suited to touchscreens. The problem was how sluggish it all was. Since the interface rendering wasn’t really optimized for a notebook like this one (the on-board GPU is laughable, it comes with an Intel Atom CPU, and it has only 4GB of memory), I couldn’t do much with it. So I had to put the notebook aside for a while.

After about a month, I thought, Why not test out some other desktop environment on it? It’s not that hard to switch desktops on Fedora. So I installed the other big desktop environment, KDE. I had heard that KDE is usually hard on resources, mainly because of all the pizazz and eye candy it offers. Since I was testing this notebook out anyway (and because it had been a long time since I had used KDE), I didn’t expect the findings to be amazing.

So I installed KDE (the Xorg version first) and logged into that. The look, polish, and experience blew me away. Not only does the Plasma desktop (version 5.14.5) run very well on this notebook, it also has some support for touchscreens. The applications menu launcher can be changed to a touch-friendly one, and swipes and drags respond well in some menus (e.g. in System Settings), though scrolling is still handled by the scroll bars on windows. It’s nice that KDE has some touchscreen support, but Gnome still has it beat in this regard. Under Gnome with Wayland, the touchscreen is well supported: finger swipes and drags work, and in many applications, two-finger zooming and expanding work, too (at least in the ones which can expand and zoom). Other features of the notebook, though, are lacking.

Only one of the cameras (the front-facing one) on the device works; the other (the back one) goes undetected. As far as I can discern, the kernel I’m running (4.20.16) doesn’t detect the back camera. The accelerometers on the tablet, while detected, are only used in Gnome; special methods had to be found just to get some screen auto-rotation functionality in KDE. What’s worse, since there is little Wayland support in KDE (especially in kwin), the auto-rotation function can’t be used there. The problems with Wayland also extend to the HP x2 Notebook’s stylus.

The HP Active stylus is detected and supported, but not fully. Stylus support in Wayland is there, but it doesn’t always work as intended. Under Gnome (and using the Krita program), the stylus mostly works: it’s detected as a separate input, and it has pressure sensitivity. With KDE and Wayland, though, the stylus is not detected at all, so it can only be used with Xorg. This is fine for most applications, though having a little multi-touch support is nice: in Krita, for instance, two-finger zooming and expanding is supported, along with other smaller capabilities.

In the past, I had read that notebooks similar to this one lacked sound support; users couldn’t get sound working at all. I am happy to report that sound works under Linux. I was able to watch a couple of YouTube videos on this tablet, and the sound came through. Bluetooth is also recognized, though I did not use it. The battery was recognized, too, and from the looks of it, the device has great battery life. As for its suspend and sleep capabilities, though, the HP x2 Notebook could fare better. When closing the lid, the device goes into suspend. However, it never goes into hibernation afterward, no matter how many times I’ve tried to make it. I have also changed the options in the /etc/systemd/logind.conf file (setting the HandleLidSwitch and HandleLidSwitchDocked options to “hybrid-sleep”), but that doesn’t seem to affect suspend or hibernation. This is great as a notebook, but as a tablet? That looks uncertain.

When detached, an on-screen keyboard can be used. The keyboard accessibility feature in Gnome can serve as an on-screen keyboard, though it doesn’t always pop up (it doesn’t show up when typing in Chromium, for example). Another on-screen keyboard, onBoard, can be used as well, though the user has to tweak the program to their liking. Otherwise, wherever a tap is necessary, the desktop can provide it. And since Chromium and Firefox already have touchscreen support, navigating sites isn’t that difficult.

The upright screen. What fun.

The support for the HP x2 Notebook has grown tremendously since it was released a couple of years ago. It’s great that a nice little notebook like this one is finally usable under Linux. It can be used for small stuff like surfing the web, and possibly used as a tablet. Though it can’t be used in a similar fashion to its Android brethren, it does desktop stuff nicely.

Marketing, Research, and Chasing Trends in Video Games

In a recent post by Jerry Holkins, a.k.a. Tycho Brahe of Penny Arcade fame, he talked about how some game marketers track how well a game is doing. He wrote of how Apex Legends now dominates Twitch, having overtaken Fortnite for the top spot. Jerry notes how he once overheard a marketer say that, if you’re not near the top of Twitch’s list, you don’t exist. Jerry then wondered: since viewers usually aren’t playing the game they’re watching, should this really be the main concern of a video game marketer? More to the point, should the number of viewers of a game be indicative of how much revenue the game will generate? He discusses how the “core metric” of advertising has changed from “page views” to “unique” views, and how tracking these may not pay off. For media found on the Internet, the Kindle, or even in video games, these constant change-ups are even less likely to pay off. As Jerry writes, “The best you can hope for in media is that a billionaire pays to keep you around as a pet, the same way a supervillain might deign to have a white cat close at hand.” And this is one of the tragedies of this sort of advertising.

A marketer will collect all of this information (these “metrics”), hand it off to an analyst, and ask how those metrics transform into profit. The analyst will try their best, do some research, perform some careful analysis, and hopefully come up with a winning model. If it predicts how the market will go, then the analyst succeeds, the marketer succeeds, the firm succeeds, and everything is peachy. That doesn’t always work, though.

Analysis is tough. The researcher doing the analysis must ensure that they’ve had enough time to analyze the data, gain valuable insight, and explore it enough to know whether a change in one variable will actually affect another (preferably the variable tied to profit). This process could take weeks, months, or even years just to gain knowledge that describes the market. How should they approach analyzing it? What should the methodology be? Another problem could be the data collected. Was it tainted? Was the sample size large enough? They won’t know until the analyst can make a good prediction of the overall trend of the target market. If that doesn’t work, the researcher has to collect more data, and marketers don’t have that sort of time.
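To give a rough sense of the sample-size question, here is a minimal sketch (in Python, with a hypothetical scenario of my own invention, not anything from an actual marketing firm) of the standard formula for how many survey responses are needed to estimate a proportion within a given margin of error:

```python
import math

def sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    """Minimum sample size to estimate a proportion within +/- margin_of_error.

    Uses the standard formula n = z^2 * p * (1 - p) / e^2, where z is the
    z-score for the confidence level (1.96 for 95%). p = 0.5 is the worst
    case (maximum variance). Rounds up, since n must be a whole number.
    """
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

# To estimate, say, the share of viewers who actually buy a game to
# within +/- 3 percentage points at 95% confidence:
print(sample_size(0.03))   # 1068 responses
```

A researcher working from far fewer responses than this back-of-the-envelope number suggests is exactly the “small sample size” trap described above.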

When a marketer finds a potential market, they must ensure that they maximize their client’s profit. Usually, time is of the essence; time wasted is money wasted. So when a researcher finds a promising link between a “core metric” and the chance of profit, a marketer must seize on it, or lose their job. So they push on, hoping that the market research is a solid lead to profit. But market research is not an exact science with conclusions carefully derived from good data and unbiased observations, so the link the marketer finds between the “core metric” and the gain in profit may be only as good as a roll of the dice.

Market research, especially for markets pertaining to video games, movies, digital books, and web advertisements, is difficult to pin down. While market research in the Internet age has matured over the past 25 years, with research on which models work, and which trends could pay off, marketers are still chasing that dragon, that metric found in some market which will lead them to riches. Sadly, the research and analysis they may do could be incomplete, either due to not enough research, a small sample size, poor data, or just incorrect observations. Since the marketer has to advertise and push the product in a market, while the market still has profit to be gained, they cannot wait while the data is analyzed, trends are found, and facts are verified. And so, the public will hear about another market force, a new “core metric”, every quarter, every year, when a researcher publishes a hot find about a new video game, a new book, a new Internet fad, or just a new source of entertainment.

A way of describing IT audit

You know how your day goes? You know how you use all of these widgets and doohickeys, and that’s how you usually converse or get any information? It’s my job to keep all of that stuff working (at least, I would like to). I’m not talking about just making Windows work on your computer, or troubleshooting your iPhone. I’m talking about the systems and processes which are used by those devices. You see, there are complex processes at very large companies which must keep running. You can’t send text messages? You can’t buy stuff off Amazon? You can’t watch cat videos? Something has gone wrong in those services, and it’s up to guys like me to prevent that from happening.

So I’ll check that the servers which provide the videos can still function when they’re running at full tilt, or when there’s barely a trickle coming in.  Regular phone calls and text messaging have complex systems which must  be checked to ensure they’ll do their job every single time. This includes gathering evidence, and assuring that what management claims about the workings of their systems is true. Has the system administrator updated the server’s OS to its latest patches? Did they test to ensure that the patches wouldn’t break anything? Are change controls in place to ensure that they even check that these patches wouldn’t break the system? How about security procedures? Have they been followed? Has the system been hardened to identified threats? These are some of the things IT auditors have to check.

This is not just about checking the hardware, though. Has management performed its due diligence and made sure that its data is backed up? How do they know the backed-up data can actually be restored? Who is in charge of checking that the data is backed up and stored properly? If the servers go down, whose job is it to ensure that they all come back up again? More questions can be asked, but I’m sure you get the gist of it.

So you may have heard of companies getting their data breached, or announcing new services they will provide. Either way, it’s up to someone at these companies to ensure that governance practices have been followed, that the company can actually provide the service, that the service won’t expose the company to risks (e.g. litigation, huge losses, or data breaches), and that it will actually benefit the company. And usually, that’s someone who knows about IT audit.

How Digital Coupons Work

If you’ve been to a few supermarkets lately, you may have heard of digital coupons. If not: they are virtual savings which customers at brick-and-mortar stores can apply toward their purchases (in contrast to discounts given in digital stores). So how do these imaginary savings work? In this article, I’ll give you a small crash course, showing how the user signs up, how the coupon may run through a supermarket’s database, and how it is processed at the checkout. Let me first say that I have only a modest understanding of digital coupons. From what I’ve learned about information systems, how point of sale systems work, and a few of the concepts which string them all together, I can give you only an overview of how it all works. My intention is to inform and demystify, because an informed customer is a happy customer. So let’s get to it.

There are supermarkets across the U.S. which require a club card in order to purchase the store’s items at a sales price. Higher management at these supermarket chains thought that getting some customer data wasn’t enough, nor were the store’s services providing enough value. So they created these digital coupons which were exclusive to their store. With these digital coupons, a customer opts in to their service, then the customer applies these coupons to their club card, and they’re usually applied at check out. How this all works, though, is a bit more complicated than that.

The customer first enrolls in the digital coupon service. In the supermarket’s system, an entry is made in their database which tells the point of sale system that the customer uses digital coupons, and to be sure to check for this. The customer then adds coupons via an interface (either a web portal or a mobile phone app). The system assigns these coupons to the customer’s file in the supermarket’s database. At check out, the cashier scans the items as usual, and the customer’s club card is applied. Depending upon how the point of sale system is set up to look for the digital coupons in the company’s database, the digital coupons may be applied during the order, or near the end of the order. If it’s near the end of the order, the cashier totals out the order, and the point of sale system evaluates the order, matching items to digital coupons in the customer’s database file. The look-up process can be a bit tedious, because the digital coupon system could be set up such that the digital coupons are matched against the UPC of the product, and so it could take a few seconds, especially for large orders. Finally, any discounts are applied, and the order continues to its end. Is there anything else the digital coupon system can do? You bet!
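To make the look-up step concrete, here is a minimal sketch (in Python, with made-up items, UPCs, and coupon values; a real point of sale system is far more involved) of matching a customer’s clipped digital coupons against the UPCs in an order when the cashier totals out:

```python
# Hypothetical data: clipped coupons keyed by product UPC, as described above.
clipped_coupons = {
    "041220576463": 0.75,   # cereal: $0.75 off
    "011110856418": 1.00,   # coffee: $1.00 off
}

def apply_digital_coupons(order, coupons):
    """Match each scanned item's UPC against the customer's clipped coupons.

    `order` is a list of (upc, price) tuples from the register.
    Returns (total_after_discounts, total_discount). Each coupon is
    assumed to apply once per matching item.
    """
    total = 0.0
    discount = 0.0
    for upc, price in order:
        total += price
        if upc in coupons:
            discount += coupons[upc]
    return round(total - discount, 2), round(discount, 2)

order = [("041220576463", 3.99), ("011110856418", 7.49), ("000000000001", 2.50)]
total, saved = apply_digital_coupons(order, clipped_coupons)
print(total, saved)   # 12.23 1.75
```

This also shows why large orders can make the look-up tedious: every scanned item has to be checked against the customer’s coupon file before the discounts land.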

The system also keeps track of what the customer has purchased. This information can then be used in a few ways. For one, the data can be used to improve the processes and products of the supermarket (though this has been a part of club cards for years, and not just with digital coupons). For another, it can be sold to third parties (mind you, supermarket chains have been monitoring these practices closely so as not to harm the customer, as well as not increase risk for the company). To add to this, a company can make a direct connection to how a customer purchases a product. The company can take this information to the manufacturer of the product, and negotiate better prices on the product. Or, the supermarket company can provide a personal discount on the product because they know that the customer will continue to purchase the product, and the supermarket company can still turn a profit.

Again, remember this is a simplification. The supermarket company may do little with the data it receives. The company could merely provide digital coupons, along with some other perk (such as a free product every so often, e.g. a free sandwich). Also, how the supermarket company uses the data could be very different from what I have described. For all I know, it could be used in logistics, in advertising, or even in the layout of new stores. Just know that a system like this can be complicated, can be a bit tedious, and yet can also provide a good amount of value. How much are we talking? From my experience, a customer can save an extra 4% to 10% with digital coupons (though I’m sure it’s mostly 4%).

How GNU/Linux is different from other OSes

If you’ve heard of GNU/Linux and want to try it out, then congratulations and welcome to the world of open source OSes! I hope that your experience with using GNU/Linux is a joyful and informative one. If you’ve already done a little homework, you’ve found many variations (called “distributions”, or “distros” for short) of GNU/Linux, and may be mystified by how many there are. I’m sure you’ve come from Windows or Mac OS and have wondered, Why is it that there’s one Windows/Mac OS, and so many different Linuxes? There’s a reason for this, though to understand why there are so many distros, we must first explain the purpose of some OSes.

Microsoft makes Windows to be used by the average person, as well as business-oriented people. In fact, they’ve targeted quite a large group (I’d say the entire human race, but that’s just me). MS needs to ensure that Windows works with many desktop and laptop computers, functions in a way that makes sense, and can be used by the average person. To that end, they have engineered it to work on many different computer configurations and made their desktop experience pretty darn good. For servers, they have developed a version of Windows, Windows Server, which works well on enterprise-level hardware, especially in data centers and in any application that requires servers. This is different from what Apple has done.

Apple created Mac OS X (along with iOS) to work exceptionally well with their terrific hardware. They have managed to create an OS that synergizes with that hardware, looks beautiful, and is very well organized. Via the Aqua user interface, the user can easily navigate applications and get things done. Apple has ensured that their hardware is durable, that their mobile devices last a long time, and that the hardware retains a good amount of value. To this end, Apple designed the Cocoa API to work smoothly with their hardware while not being that difficult to program against. Similar to Microsoft, Apple has a large audience, but they have targeted it a little differently, striving for simplicity and high value. While it is possible for the average consumer to purchase an iPhone, Apple devices are still quite expensive, and so their audience is usually smaller than Microsoft’s. So where does GNU/Linux fall in all of this? Somewhere in the middle.

GNU/Linux consists of the Linux kernel, utility programs, and other supporting libraries. When a person or organization has a use for GNU/Linux, they may take the kernel and the utility programs and build up their own OS. Canonical wanted to make a Linux distro which worked well with hardware and provided a great user experience, so they copied (i.e. “forked”) Debian, added in the Gnome desktop environment, worked with hardware manufacturers to ensure devices functioned as intended on GNU/Linux, and developed a great experience. With this, they made Ubuntu. Companies like Red Hat did something similar in the past (in their case, with Red Hat Linux, whose enterprise descendant targets server hardware). Red Hat also sponsors Fedora as a community distribution. They, too, used Gnome, and made a nice installer. But GNU/Linux is not restricted to desktop hardware.

Linux has been ported to many platforms, ranging from the traditional x86, to RISC, to ARM, and several architectures in between. A company (or individual) can take the kernel, mix in whatever software they wish (be it open source or proprietary), and put it into their hardware. They can design the distro such that it functions how they want it to function. In these cases, GNU/Linux has a special purpose, and the organization may release their distro with that special purpose in mind. So a software vendor may make a special Internet of Things distro, and release that. Or a small group may release a fork of a desktop distro because they wanted to add their own “spin” to an official distribution. It all depends on what the individual, group, or company wants. And this is what sets GNU/Linux apart from the proprietary OSes: you can find the one you want, and tweak it however you wish.

For Calculating a Whole-Numbered Grocery Bill

This is a little piece I wrote in 2014.

Here’s the formula for calculating an even amount for your groceries at the supermarket.

Let $T$ be the total price of the order, $P_i$ the price of the $i$-th non-taxable item, $G_i$ the price of the $i$-th taxable item, $t$ the tax on a taxable item (usually a percentage), $n$ the number of non-taxable items in the order, and $m$ the number of taxable items in the order. For the bill of your groceries to be an even amount, the following must be satisfied:

  $T = \displaystyle\sum\limits_{i=1}^{n} P_{i} + (1 + t)\sum\limits_{i=1}^{m} G_{i}$
All one needs to do is find an item $P_i$ or $G_i$ which makes $T$ a whole number. To put it another way, you must satisfy this condition:

  $\displaystyle\sum\limits_{i=1}^{m} G_{i}=\frac{T - \sum\limits_{i=1}^{n} P_{i}}{1 + t}$
where $T$ is the whole number you’re aiming for.
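As a quick sketch of the formula in code (in Python, with a made-up basket and a made-up 6% tax rate), here is a check of whether an order lands on a whole-dollar total:

```python
def order_total(nontaxable, taxable, tax_rate):
    """T = sum of non-taxable prices + (1 + t) * sum of taxable prices."""
    return sum(nontaxable) + (1 + tax_rate) * sum(taxable)

def is_whole(total, tolerance=0.005):
    """True if the total, rounded to the cent, lands on a whole dollar."""
    return abs(round(total, 2) - round(total)) < tolerance

# Hypothetical basket: $3.50 + $4.25 non-taxable, one $2.12 taxable item
# at 6% tax: 3.50 + 4.25 + 2.12 * 1.06 comes to $10.00 at the register.
total = order_total([3.50, 4.25], [2.12], 0.06)
print(round(total, 2))   # 10.0
```

Picking that last taxable item is exactly the search for a $G_i$ that the formula describes.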

So the next time someone at the checkout stand looks at their total and says, “I couldn’t do that again if I tried,” show them this article and explain to them that it can be done.

How to Add C# Assemblies to a Godot Engine Project

With the release of version 3.0 of the Godot Engine, C# was added as a supported programming language. Thanks to other engines like Unity, there is a large number of C# assemblies which can help in developing a game. However, there isn’t an easy way to add assemblies in the editor, so this tutorial was created to help with that problem.

In this tutorial, we’ll use the Godot Engine, version 3.0, along with the assembly for Rant. Note that while a C# assembly can be added to the editor, the assembly cannot be exported to a self-contained executable just yet.

Update: as of version 3.0.5 of the Godot Engine, exporting C# projects with assemblies has become possible.

1. Create a project as normal. Where necessary, add a C# script. The engine will then generate the necessary C# files (i.e. the “*.csproj” and “*.sln” files).

2. Add in the C# code. After this, close the editor for the moment.

3. Ensure that the correct assembly is in the Godot project working directory.

4. Go into a desirable IDE and open up the “*.csproj” file which was created when the C# script was added. Monodevelop will be used in this tutorial. At this point, ensure that the Godot editor isn’t running. Otherwise, key files may be overwritten.

5. In Monodevelop, right-click on the References node in the tree view for the project and click on Edit References... Be sure to first expand the project so as to reveal References. Then click on the .NET assembly tab and click “Browse…”.

6. Select the desired assembly for the project and then click “Open”.

7. It should then be added as a reference for the project. Click “OK” and save the project (“Ctrl + S” or “File -> Save”). Exit out of the IDE and go back into the Godot editor.

8. Add in a GDScript which makes use of the assembly.

9. If all goes well, the engine should use the assembly in the resulting game.

If you have any comments or questions, please ask them below, or get in touch with me.

Connecting your JLab Intro headphones on Elementary OS

The JLab Intro headphones are a low-cost pair of Bluetooth headphones. Though not the best, the sound is still decent. Here’s how to connect the headphones on Elementary OS.

First, put the headphones in discovery mode. To do this, press and hold the on/off button for about six or seven seconds, until it starts flashing red and blue.

Then use Bluetooth Manager (use apt install blueman to install it) to search and select the headphones (they’ll show up as “JLab Neon BT”):

Once you select it, click on “Create a pairing with device” (click the “lock” icon), and Bluetooth Manager will attempt to pair with the headphones.

Right-click on the device and select “Headset”. This should tell the headphones to act like an audio device.

Go into “System Settings” -> “Sound” and select “JLab Neon BT”. Click on the “Mode” dropdown menu and select “High Fidelity Playback (A2DP Sink)”. Sound should now be playing through the headphones. You can also adjust the volume through the volume control on the headphones.

After that, you can use the “Test Sound” utility to test that the sound is working.

Auditing a charity’s network, and finding something out of place

For the past six months or so, I have been helping a local charity with its I.T. needs. This includes updating their computers, designing and setting up a kiosk for its volunteers, and helping other charity members with their IT needs. Now I’m trying to map their network, and help the director of I.T. to ensure that all devices (desktop computers, printers, external hard drives, etc.) are accounted for.

About two or three weeks ago, I used nmap to do a quick scan of the local network, checking the devices on it for what they were broadcasting, what ports were open, and what each machine actually was. After finishing that, I met with their director of I.T. to discuss my findings. He verified most of what I found (we had a problem with one of those Western Digital MyCloud hard drives, but that was soon cleared up). But there was one part which baffled the both of us.

Just like many other small organizations, they use old phones. The ones they use are Avaya IP phones (i.e. voice over internet protocol [VoIP] phones), model 1616. I don’t know when they got these phones, but they’re old. When I scanned these phones, I only got their IP addresses; there were no hostnames. However, one particular phone did have a hostname, and not one you would find on any of the other computers on the network (those had hostnames ending in ".local"). This one had the hostname "6lfb7c1.salmonbeach.com". How it got this hostname, I am not entirely certain.
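To show the kind of check involved, here is a minimal sketch (in Python, with made-up IP addresses and scan results modeled loosely on what I described; a real audit would pull these from nmap output) that flags hostnames which don’t match the network’s expected “.local” pattern:

```python
def flag_unexpected_hosts(scan_results, expected_suffix=".local"):
    """Return (ip, hostname) pairs whose hostname doesn't fit the network.

    `scan_results` maps IP addresses to resolved hostnames, with None
    for devices that reported no hostname (like most of the VoIP phones).
    """
    flagged = []
    for ip, hostname in scan_results.items():
        if hostname is not None and not hostname.endswith(expected_suffix):
            flagged.append((ip, hostname))
    return flagged

# Hypothetical scan results: one workstation, one silent phone, one oddity.
scan = {
    "192.168.1.10": "frontdesk.local",
    "192.168.1.21": None,                        # Avaya phone, no hostname
    "192.168.1.22": "6lfb7c1.salmonbeach.com",   # the odd one out
}
print(flag_unexpected_hosts(scan))   # [('192.168.1.22', '6lfb7c1.salmonbeach.com')]
```

A simple pass like this is how one stray hostname stands out against a network of otherwise uniform devices.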

At first, I thought these were phones with just a few features (voice mail, call forwarding, conference calls, stuff like that). As I have since found out, these phones are fully featured and have upgradeable firmware. The phone in question, the model 1616-1 BLK, gets its firmware from the local Avaya PBX server. Since it gets its firmware from the server, how could the hostname have been changed? In the phone’s settings, the hostname can’t even be changed. One of the members of the charity’s administration said that they had problems with the voice mail system months ago, but I doubt that’s related to this problem.

So how should I approach this? Has it been hacked? Is it just a software glitch? Hopefully it’s nothing serious. The I.T. director said that he bought a bunch of these old phones on the cheap years ago, and he’ll look into flashing the firmware on the phone. So let’s hope that’s the last we’ll hear of it.

Using Rant in my Python program because I’m a glutton for punishment

Over the past few months, I have been researching and developing a little procedurally generated game which will eventually be created in the Godot engine. This game will have a story that’s procedurally generated for the players. A part of this game is the dialogue, which will also be procedurally generated. To accomplish this, I set out to find a library of some kind which can create procedurally generated dialogue (or at least the dialogue that I want) and is written in my programming language of choice, Python. From the looks of it, there isn’t one, and so I had to look elsewhere. That’s when I stumbled upon something called Rant, which is billed as a library that can procedurally generate dialogue. At first I thought I had found what I was searching for. Sadly, though, it is written in the least open source-friendly language I have ever seen: C#. It can be used on Linux (with the Mono runtime), but I’m looking for a solution where I don’t have to juggle a bunch of programming languages to achieve what I want.

At first, I tried making some kind of dialogue scheme that would suit my needs. I threw in some sentences of what may define the NPC, and mashed it all together. From the looks of it, though, the scheme is getting out of hand. I have several lines of dialogue, and I’m not even finished. I don’t entirely know how I’ll fit it all together, considering this is just for a simple demo of the full game. It looks like I’m going to have to get creative.

I went digging and searching around, and I came upon several possible ways of integrating C# code into Python. There’s IronPython, a full implementation of Python in C#. The big problem with this was that it didn’t look very portable to me, as I would have to bundle the .NET libraries with the game for each platform, and that’s a royal pain in the ass. Then I looked at Python.NET, which looked very promising: you can call C# code from Python, and you can call Python code from C#. It looked like the best of both worlds. Actually making it work, though, is a bigger problem.

When I tried to use the Rant.dll assembly in my Python program, I found that I can’t do that directly because, well, it’s C# code, and regular old CPython (which comes with many Linux distributions) can only import C or C++ extensions. Then I looked into using the clr module from Python.NET, but I couldn’t find a version built for Linux. Through a lot of hand wringing, brow beating, and code cracking, I found that I had to use the latest version of Mono (version 5.0.1) along with an unstable version of Python.NET, which I built with the suggested command: python setup.py build_ext --inplace. The built shared object library file, “clr.so”, and the “clr” module load in Python. Heck, I was even able to load the pre-built “Rant.dll”. But this is nothing compared to what I must do now: actually making some procedurally generated dialogue with Rant. And I don’t know where to begin with that.