Don’t Even Read This

It’s been too long since my last post. I think the three or four people who read this blog are getting bored of checking to see if I’ve made a new one. So here it is.

I’ve been distracted from an iOS app I’ve been working on. The app involves writing a custom view to support different layouts in portrait and landscape orientations, and UI is not exactly the most exciting thing for me. The distraction, on the other hand, has proven to be too much of a distraction. I had this idea that should work. At least, I think it should. My math skills are a bit iffy, but intuitively, I think Shannon’s Law allows it.

Let me post a bunch of code and then explain what it is I’m trying to do. Bear in mind that this is strictly an intellectual exercise with no intrinsic or economic value.

[gist user="DavidSteuber" id="78b2c6a83932b0e307d8"]

Now for a quick run-through.

qrCodeWithMessage creates a QR Code from an NSData object. This is straight-up binary data. The idea is that the data would be the on-disk representation of a PNG file that displays the QR Code. If a QR Code scanner can decode the binary data and discern that it is a PNG file, it should display the image, which is itself the QR Code. At least, that is the goal here. The current code doesn’t work; I’ll get into that.
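For reference, Core Image ships a built-in QR generator filter; a minimal sketch of what qrCodeWithMessage boils down to might look like this (the function name here and the choice of correction level are my assumptions, not necessarily the gist’s exact code):

```swift
import Foundation
#if canImport(CoreImage)
import CoreImage

// Sketch: generate a QR Code CIImage from arbitrary binary data using
// Core Image's built-in CIQRCodeGenerator filter.
func qrCode(from message: Data) -> CIImage? {
    guard let filter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    filter.setValue(message, forKey: "inputMessage")
    // "L" is the lowest error-correction level, which leaves the most
    // room for message data.
    filter.setValue("L", forKey: "inputCorrectionLevel")
    return filter.outputImage
}
#endif

// The message here would be the raw bytes of the PNG file itself.
let sampleMessage = Data("not a real PNG, just a stand-in".utf8)
```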

colorSpaceIndexed is supposed to create a two-color black-and-white palette for a one-bit-per-pixel PNG file. I don’t get any errors from it, but it doesn’t seem to work, and I don’t know why. I’ll post links at the bottom of the article to the places where I’ve asked how to get things working.
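For what it’s worth, this is the kind of palette and indexed color space I believe that step needs (using the modern Swift spelling of the Quartz API; the palette values are just the obvious black/white choice):

```swift
#if canImport(CoreGraphics)
import CoreGraphics
#endif

// Two palette entries, each an RGB triple: index 0 -> black, index 1 -> white.
let colorTable: [UInt8] = [0, 0, 0, 255, 255, 255]

#if canImport(CoreGraphics)
// `last` is the highest valid pixel index, so 1 for a two-entry table.
func makeIndexedColorSpace() -> CGColorSpace? {
    return CGColorSpace(indexedBaseSpace: CGColorSpaceCreateDeviceRGB(),
                        last: 1,
                        colorTable: colorTable)
}
#endif
```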

createBitmapContext creates the drawing context for the CIImage that is produced by qrCodeWithMessage. A CGContext is required to create the pixel data for an image that can then be drawn using Quartz. This is a gray scale context; one-bit-per-pixel contexts are not supported.

createCGImage produces a CGImage from the CGContext. As of this writing, it is a gray scale image, which chews up 8 bits per pixel. Even with compression, that is too much data for an image that only has two colors.
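Some back-of-the-envelope arithmetic shows why the gray scale route is hopeless. Assuming a version 40 QR Code (177×177 modules) rendered at one pixel per module:

```swift
// A version 40 QR Code is 177x177 modules.
let side = 177

// 8 bits per pixel: one byte per module.
let grayscaleBytes = side * side

// 1 bit per pixel: each row padded out to a whole number of bytes.
let indexedBytes = side * ((side + 7) / 8)

// 31329 bytes versus 4071 bytes of raw pixel data -- roughly an 8x
// difference before zlib compression even enters the picture.
```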

saveImage does pretty much what it says. The NSData which contains bytes representing a PNG file is saved to disk.

getImageFileData simply gets the bytes from the NSData object that represents a PNG file. This is so I can compare the bytes for equality.

compare is the function I use to do the comparison. It works much like the standard Unix comparison routines: it tests for equal length first and then compares byte for byte.
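A sketch of that comparison in Swift (the gist’s version may differ in detail):

```swift
import Foundation

// Length check first, then byte-for-byte, like cmp(1).
func compare(_ a: Data, _ b: Data) -> Bool {
    guard a.count == b.count else { return false }
    for i in 0..<a.count where a[i] != b[i] {
        return false
    }
    return true
}
```

In current Swift, Data conforms to Equatable, so `a == b` does the same job; the explicit loop just makes the semantics obvious.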

main makes it all go.

The problem I’ve had is that I have been unable to find out how I’m supposed to create an indexed PNG image. PNG supports such a representation; the QR Code below, generated from a web site and pointing to my blog, uses it.

QR Code

If you inspect the properties of this image, you can see that it is a 1 bit per pixel indexed image. This is the format I’m trying to create with my Swift code using Quartz.

What I get instead is this: an 8-bit gray scale image with no alpha channel, which exceeds the amount of data that can be stuffed into a QR Code.

8 bit gray scale PNG file

This is not what I want.

To be fair, even if I got the 1-bit-per-pixel file format, I’m not certain that it would fit into a QR Code. I also don’t know whether the process will converge on a QR Code that represents itself as a PNG file. But it would be very cool if it did.
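The budget, at least, is easy to pin down. A QR Code tops out at version 40, which in 8-bit byte mode with “L” error correction holds 2953 bytes, and the fixed PNG overhead for a 1-bit indexed image is small (standard chunk framing is 12 bytes per chunk: length, type, and CRC):

```swift
let qrMaxBytes = 2953      // version 40-L, 8-bit byte mode

let signature = 8          // PNG file signature
let ihdr = 12 + 13         // chunk framing + 13-byte header payload
let plte = 12 + 6          // chunk framing + two RGB palette entries
let idatFraming = 12       // framing for a single IDAT chunk
let iend = 12              // empty IEND chunk

let overhead = signature + ihdr + plte + idatFraming + iend   // 75 bytes
let zlibBudget = qrMaxBytes - overhead                        // 2878 bytes

// So the zlib-compressed 1-bpp bitmap has to land under ~2878 bytes
// for a self-describing QR Code to be possible at all.
```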

This thing is taking up my waking mind. I’m at war with Quartz to get the file representation that I want. I’m actually tempted to take zlib and libpng and use them to do the work myself. But that would be insane, since OS X and iOS obviously have no problem drawing a PNG file of the type I want to produce.

So how do I produce the PNG file that I want using Quartz?

Boards I’ve posted my question on:

Yes, Back To Basics

The problem I’ve been having with the UICollectionView has been solved by vacawama on Stack Overflow. The question I posted, along with the responses I got, is here:

It has been pointed out, correctly, that my manner of asking the question was not ideal (paraphrasing here). I won’t make excuses. Still, I couldn’t think of any other way to do it; I had no idea where the problem lay. I didn’t think pulling from GitHub would be a huge burden. I was probably wrong on that point, even though vacawama took it up.

Now I can go back to the app that really matters and fix the collection view there. I’ve also committed the fix, with credit, to GitHub. So anyone with any interest in a modern way to implement Apple’s CollectionView-Simple example in Swift 1.2 can pull down that project and play with it.

It’s something of a stress relief that the problem was solvable. I just wish I knew how the view options got screwed up.

Back To Basics?

I’ve been working on a simple (relatively speaking) app that I thought would be useful at conventions and such. The key UI element in this app is a UICollectionView. Apple has a demo app for using it called CollectionView-Simple. It works just fine as given. However, I’m using Swift 1.2. So I translated the example with a project starting from scratch.

It doesn’t work.

OK, it sort of works. But the collection view does not draw properly, and it draws differently on different devices. The thing that is really bugging me (apart from not being able to solve this little issue) is device rotation: the view is supposed to redraw itself to fit the new orientation, and it’s not doing that.

To be honest, this is driving me mad. I’m sure I’ve looked at every option in the storyboard editor (Interface Builder) to make sure that all the settings are the same as in the demo app. I can’t find a discrepancy. Except for rotating the device to a different orientation, the app works properly (so far as I can tell).

WWDC is coming up in June. It would be nice to have my app on the iTunes Store by then. But I must confess that little issues like this really drag me down mentally. If there is anyone out there who can figure out the issue, I sure would appreciate it.

Here is the project on GitHub.