Closing One-Off Tags in HTML5?

I wasn’t sure whether I needed to close my meta tags in HTML5. And br tags, for that matter. People call them one-off tags, unpaired tags, self-closing tags, monotags, bachelor tags, among others.

They’re called void elements.

According to W3C, the void elements are: area, base, br, col, embed, hr, img, input, keygen, link, meta, param, source, track, wbr.

Under “8.1.2 Elements”:

Void elements can’t have any contents (since there’s no end tag, no content can be put between the start tag and the end tag).

Under “8.1.2.1 Start tags” it says:

Then, if the element is one of the void elements, or if the element is a foreign element, then there may be a single “/” (U+002F) character. This character has no effect on void elements, but on foreign elements it marks the start tag as self-closing.

So, “/>” has no effect on a meta tag. Or any other void element. But it is not invalid to put it there.
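None of this is browser-specific, either. As a quick illustration (a sketch only — the TagCollector class below is mine, not part of any standard tooling), Python’s built-in html.parser treats the two spellings identically:

```python
from html.parser import HTMLParser

# Collects every start tag the parser sees. HTMLParser's default
# handle_startendtag() delegates to handle_starttag(), so <br> and
# <br /> both end up recorded the same way.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

plain = TagCollector()
plain.feed('<meta charset="utf-8"><br>')

slashed = TagCollector()
slashed.feed('<meta charset="utf-8" /><br />')

# The trailing "/" makes no difference to the parsed result.
assert plain.tags == slashed.tags == ["meta", "br"]
```

Both documents parse to the same sequence of start tags, which is exactly what the spec’s “this character has no effect on void elements” language promises.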

Posted in Technology | Leave a comment

WTF is a “Desktop Device App”

Apparently you now have to run the Windows App Certification Kit (WACK) on the software executables you publish in order to avoid a moderately scary warning from Windows SmartScreen.

Upon running WACK, you’re presented with four options:

[Screenshot: Windows App Certification Kit start screen, with “Validate Desktop Device App” highlighted]

  • Validate Windows Store App

  • Validate Windows Phone App

  • Validate Desktop App

  • Validate Desktop Device App

All of them seem pretty self-explanatory, except for the last one, “Validate Desktop Device App”. It’s described as “Test a desktop device app for compliance with value-added software requirements”. Huh?

Searching Microsoft and MSDN for the phrase “value-added software requirements” turns up zero results. Searching for just “value-added software” (isn’t all software value-added?) led me to this post on Raymond Chen’s The Old New Thing blog which indicates that “value-added software” is the crapware/shovelware that’s pre-installed on computers bought at, say, Best Buy.

I really wish Microsoft would focus less on consistent branding and more on actually explaining things. Incidentally, I gagged a little bit when I read the phrase “Validate Desktop App”.

Posted in Technology, Windows | Leave a comment

Minimal Steps to Fake Authenticode Signature (Self-Signing)

Here are the minimum steps required to self-sign an executable for development and testing:

makecert -r -sv mykey.pvk -n "CN=MyCompany" -len 2048 mycert.cer
pvk2pfx -pvk mykey.pvk -spc mycert.cer -pfx mycert.pfx -po mypassword

Note: You’ll be prompted to create a certificate password and it must match whatever you supply to pvk2pfx with the -po switch.

To sign an executable, use:

signtool sign /f mycert.pfx /t http://timestamp.comodoca.com/authenticode /v executable.exe

Note: once you have a real code signing certificate, you’ll use whatever timestamp server your provider gives you. Comodo works fine for self-signing testing purposes.

To automatically sign a binary at build time in Visual Studio, go to Project Settings | Build Events | Post-Build Event and add something like this to the Command Line setting:

signtool sign /f MyCertificatePath\mycert.pfx /p mypassword /t http://timestamp.comodoca.com/authenticode /v $(TargetPath)

Explanation of makecert command:

-sv Specifies the private key file.

-n Specifies the certificate name.

-len Generated key length, in bits. This StackOverflow answer indicates that Microsoft released an update blocking certificates with keys shorter than 1024 bits.

-r Specifies a self-signed certificate, i.e. one signed with its own key rather than issued by a certificate authority.

Posted in Technology, Windows | Leave a comment

Enabling uiAccess in Visual Studio C++ Projects

After spending too much time fiddling with my project’s Manifest Tool settings, trying to import an “Additional Manifest File”, I realized the solution was actually really simple. Under Project Settings | Linker | Manifest File, there is a simple dropdown for “UAC Bypass UI Protection” which sets uiAccess to true.

[Screenshot: Visual Studio Property Pages, Linker | Manifest File, uiAccess setting]

Posted in C++, Technology, Windows | Leave a comment

Windows Console and Double/Multi Byte Character Set

The Windows Console doesn’t support Unicode. It does, however, support Double Byte Character Sets using Code Pages. By changing the system locale, the Console can display Japanese, Korean, and Chinese text:

[Screenshot: with Code Page 932, Japanese file names and Unicode file content display correctly, but UTF-8 file content is gibberish]

Terminology

UTF-8 and UTF-16 are both encodings of Unicode. However, it’s common on Windows to refer to UTF-16 as “Unicode” and to UTF-8 as UTF-8, and I will follow that convention here. DBCS (Double Byte Character Set) is the only type of MBCS (Multi Byte Character Set) supported by legacy (i.e. non-Unicode) Windows applications. Japanese, Chinese, and Korean are supported via DBCS encodings. None of these DBCS encodings are Unicode, and all of them are proprietary Microsoft implementations of other standards.

Code Pages Supported by Windows

Windows supports four Double Byte Character Set code pages:

  • 932 (Japanese Shift-JIS)
  • 936 (Simplified Chinese GBK)
  • 949 (Korean)
  • 950 (Traditional Chinese Big5)

The available code pages are determined by your System Locale. If your System Locale is set to “English (United States)”, then these code pages will be unavailable to you. In this post, I will only be covering Japanese, since it’s the only language with which I have any familiarity. The steps and results would be similar for the other languages.
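As a quick sanity check of the “double byte” part, Python ships codecs for all four code pages, and each maps a CJK character to exactly two bytes. A sketch (the sample characters are arbitrary):

```python
# One representative character per DBCS code page; each codec
# encodes it as exactly two bytes.
samples = {
    "cp932": "日",  # Japanese Shift-JIS
    "cp936": "中",  # Simplified Chinese GBK
    "cp949": "한",  # Korean
    "cp950": "中",  # Traditional Chinese Big5
}

for codec, ch in samples.items():
    encoded = ch.encode(codec)
    assert len(encoded) == 2, (codec, encoded)

# Plain ASCII still takes a single byte under a DBCS code page,
# which is why these encodings are variable-width (1 or 2 bytes).
assert len("A".encode("cp932")) == 1
```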

How to Change System Locale

To change your system locale, go into “Change date, time, or number formats”:

[Screenshot: Start Menu search for “Change date, time, or number formats”]

Select the Administrative tab and click “Change system locale”. Select the new system locale, click OK, and reboot; the system must be rebooted for the locale change to take effect:

[Screenshot: the system locale setting dialog]

Windows Console Font and Code Page

The font typically recommended for Japanese output is MS Gothic. I have, however, found that Japanese text displays with the Terminal font selected, but it’s entirely possible that the UI is lying to me.

To change the Windows Console code page, use the chcp command. chcp with no arguments will display the active code page.

Code Page 932 (Japanese Shift-JIS)

With the code page set to 932 (Japanese Shift-JIS), the path separator character will display as the Yen symbol (because only the backslash and tilde characters differ from ASCII in the lower 7 bits of Shift-JIS). Japanese file names will display in Japanese, as will text saved as Unicode. Japanese text saved as UTF-8 will display as gibberish:

[Screenshot: CMD with Code Page 932, system locale set to Japanese, MS Gothic font]
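That gibberish is easy to reproduce off-console with Python’s bundled codecs — UTF-8 bytes read as code page 932 come out as unrelated characters. A sketch (the sample string is arbitrary):

```python
text = "日本語"

# The same string produces different bytes under the two encodings:
# cp932 uses two bytes per character here, UTF-8 uses three.
sjis_bytes = text.encode("cp932")
utf8_bytes = text.encode("utf-8")
assert len(sjis_bytes) == 6
assert len(utf8_bytes) == 9

# A console running code page 932 interprets UTF-8 bytes as cp932,
# producing mojibake instead of the original text.
garbled = utf8_bytes.decode("cp932", errors="replace")
assert garbled != text
```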

Code Page 65001 (UTF-8)

I have found that it will sometimes work to set the code page to 65001 (UTF-8). Japanese filenames, Japanese Unicode file content, and Japanese UTF-8 content will all three display, as shown below. However, when I experimented with this it stopped working after changing fonts and code pages a few times. My final impression is that it should work, but that the Console has some bugs in this regard.

[Screenshot: CMD with Code Page 65001, system locale set to Japanese, raster font]

Here’s a screenshot of the Console after code page 65001 stopped working as expected:

[Screenshot: Code Page 65001 (UTF-8), Japanese output no longer displaying]

References

Posted in Technology, Windows | Leave a comment

MediaWiki Error: “Error creating thumbnail: Unable to save thumbnail to destination”

I have an older MediaWiki installation on which, after upgrading to version 1.23.0, I began to see errors like:

Error creating thumbnail: Unable to save thumbnail to destination

The problem turned out to be the temporary folder setting in LocalSettings.php. It was pointing to ‘tmp’ under ‘images’, which didn’t exist. Instead, there was a folder named ‘temp’. The solution was to change:

$wgTmpDirectory     = "{$wgUploadDirectory}/tmp";

into:

$wgTmpDirectory     = "{$wgUploadDirectory}/temp";
Posted in Technology | 1 Comment

Unicode and the Windows Console

Update: after several more hours of Googling and experimenting, I have found a way to display Japanese in the Console. For more information, check out my new post, “Windows Console and Double/Multi Byte Character Set”. The rest of this post is still accurate with regard to Unicode support and Western system locales.

Have you been hoping to see Japanese (or Thai, Hindi, Arabic, etc.) characters appear when you type dir into your command prompt? Well, prepare to be disappointed, as the Windows CMD.exe Console cannot display Unicode characters. You’ll have to use the PowerShell ISE if you want to see full Unicode text output.

The best the Command Shell can do is write out boxes or question marks; when those characters are marked and copied, however, the clipboard is populated with the correct Unicode characters, which can then be pasted into smarter programs like Notepad.
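You can simulate this lossy step anywhere with Python’s codecs (a sketch; cp437 is the default US OEM console code page, and the sample string is arbitrary):

```python
# cp437 has no Japanese characters, so every one is replaced with
# "?" -- the console's question marks come from exactly this kind
# of lossy conversion into a legacy code page.
degraded = "日本語".encode("cp437", errors="replace")
assert degraded == b"???"

# The underlying Unicode string is intact; only the legacy encoding
# step loses information. That's why copy-and-paste still yields the
# correct characters in Unicode-aware programs.
assert "日本語".encode("utf-16-le").decode("utf-16-le") == "日本語"
```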

Michael S. Kaplan, an expert on all things Unicode and Microsoft, wrote about this at great length on MSDN Blogs. Unfortunately, Microsoft decided to wipe his blog from the Internet, even though it breaks links from the likes of Raymond Chen’s The Old New Thing.

Michael’s relevant blog posts can be found on The Internet Archive’s Wayback Machine:

Posted in Technology, Windows | 1 Comment

Biostar TF7050-M2 Gigabit Ethernet under Ubuntu

After several years of not bothering, I finally got Gigabit Ethernet working on my old Linux server, which has a Biostar TF7050-M2 motherboard. The motherboard has a Realtek 8111B PCI-E LAN controller, which is supposed to be 10/100/1000 (Ethernet, Fast Ethernet, Gigabit) capable, but when transferring files over Samba, I never saw more than 10 megabytes per second. After fixing the issue, my network transfers jumped to 41MB/s, as reported by TeraCopy.

Verify Current Speed

To verify that you’re running at less than Gigabit speeds, run ethtool. Here’s the output after I fixed my problem:

$ ethtool eth0
Settings for eth0:
        Supported ports: [ TP ]
        Supported link modes:   10baseT/Half 10baseT/Full
                                100baseT/Half 100baseT/Full
                                1000baseT/Full
        Supported pause frame use: No
        Supports auto-negotiation: Yes
        Advertised link modes:  10baseT/Half 10baseT/Full
                                100baseT/Half 100baseT/Full
                                1000baseT/Full
        Advertised pause frame use: Symmetric Receive-only
        Advertised auto-negotiation: Yes
        Speed: 1000Mb/s                             <=== Look at this line
        Duplex: Full
        Port: Twisted Pair
        PHYAD: 0
        Transceiver: internal
        Auto-negotiation: on
        MDI-X: Unknown

If your listing shows "Speed: 100Mb/s", then you're only getting Fast Ethernet (100 megabits per second) speeds.

Replace the Driver

I replaced the default r8169 driver (check which driver is loaded by running lsmod | grep r81 and looking for something like r8169) with an updated r8168 driver from the Realtek website.

I followed the instructions from this Ubuntu Forums post, but I found that they can be reduced to the following:

  1. Download the latest drivers from the Realtek site.
  2. Unpack the archive (tar xvjf r8168-x.xxx.xx.tar.bz2).
  3. Change into the new directory and execute autorun.sh with administrative privileges (cd r8168-x.xxx.xx followed by sudo ./autorun.sh).

The autorun.sh script will automatically remove the existing r8169 module and replace it with the correct r8168 one. This worked, even over an ssh session, though the connection dropped while the operation completed.

Power Cycle the Switch

After updating the driver, I was still only seeing Fast Ethernet speeds. The lights on my Linksys GS105 switch indicated 100Mb/s connections. Unplugging and re-plugging cables would sometimes change the indicators to Gigabit, and other times it would simply kill the connection. I finally unplugged the switch for thirty seconds, plugged it back in, and suddenly had Gigabit speeds.

Posted in Linux, Technology | Leave a comment

Visual Studio Debugging – Accessibility Applications with uiAccess Attribute Set

Debugging the application from Visual Studio first resulted in a dialog telling me I needed to restart Visual Studio as Administrator. After doing so, trying to run in the debugger resulted in an error that read, “Unable to start program” and “The application manifest has the uiAccess attribute set to ‘true’. Running an Accessibility application requires following the steps described in Help.” Of course, the ‘Help’ button takes me nowhere useful.

Running the application from Explorer resulted in a dialog that said, “A referral was returned from the server.” The fix for that error is to install the self-signed certificate into the Trusted Root store:

  1. Find the signed executable in Windows Explorer.
  2. Right click and select Properties.
  3. Select the Digital Signatures tab.
  4. Double-click the signature.
  5. Click View Certificate.
  6. Click Install Certificate.
  7. Click Next.
  8. Select “Place all certificates in the following store”.
  9. Select “Trusted Root Certification Authorities”.
  10. Click OK.
  11. Click Next.
  12. Click Finish.
  13. In the Security Warning dialog, click Yes.

This allows the executable to be run directly. Unfortunately, it still doesn’t fix the issue with debugging from Visual Studio. It appears that the only solution is to run the executable directly, then attach the debugger to the running process!

Posted in Technology, Windows | Leave a comment

Exporting Visual Studio-compatible Bitmaps from GIMP

[Screenshot: GIMP bitmap export options for Visual Studio compatibility]

When exporting bitmaps from GIMP, two options must be selected in order for the bitmaps to be compatible with Visual Studio:

  1. Under Compatibility Options, check “Do not write color space information”.
  2. Under Advanced Options, select “R8 G8 B8” under 24 bits.
Posted in Technology, Windows | Leave a comment