Here is the first tangible result of M3 Picking lists in Google Glass. This is a continuation of my previous posts in the series: Getting and processing an M3 picking list and Hello Google Glass from Infor Process Automation.
As a reminder, I’m developing a proof-of-concept to show rich picking lists from Infor M3 in Google Glass, with item number, item description, quantity, stock location, an image of the item from Infor Document Archive, and walking directions on a warehouse plan. For interactivity, the picker will be able to tap Glass to confirm a pick and to take a picture of a box at packing.
The result will be useful to showcase the integration capabilities of M3: it’s a first implementation of wearable computing for M3, and it sets a precedent for future Augmented Reality experiments with M3. For a picker in a warehouse, the advantages would be greater efficiency and new capabilities; for me, it’s a way to keep my skills up-to-date and an outlet for my creativity. It’s a lot of software development work, and I’m progressing slowly but steadily on evenings and weekends. If you would like to participate, please let me know.
Why it matters
According to Gartner, by 2016, wearables will emerge as a $10 billion industry.
According to Forbes, “Smart glasses with augmented reality (AR) and head-mounted cameras can increase the efficiency of technicians, engineers and other workers in field service, maintenance, healthcare and manufacturing roles”.
According to MarketsandMarkets, the Augmented Reality and Virtual Reality market is expected to reach $1.06 billion by 2018.
Basics of Google Glass
Google Glass is one of the first wearables for the mass market. It is a notification device worn on the face: eyewear with all the capabilities of an Android device, including a camera, a head-mounted display, a touchpad, network connectivity, voice recognition, and location and motion sensors.
It works by displaying a timeline of cards that we can swipe back and forth to show past and present events.
To write applications for Google Glass we can use the Mirror API, the Glass Development Kit (GDK), or standard Android development. I will use the Mirror API for simplicity.
I will display an M3 picking list as a series of detailed cards on the timeline that pickers can swipe and interact with like a to-do list as they progress through their picking.
For the timeline card template, I will use SIMPLEEVENT from the Google Mirror API Playground, one card per picking list line, so the picker can easily identify the four pieces of information to read per line: item quantity, item number, item description, and stock location:
I will use Bundling with bundleId and isBundleCover to group the cards together by picking list:
Picking list lines
I will get the picking list lines with the SQL of my previous post, and I will sort them in descending order: Glass displays cards last in, first out, so lines inserted in reverse order will appear in natural order to the user.
SELECT H6WHSL, H6ITNO, H6ITDS, H6ALQT
FROM MHPICL
WHERE H6CONO=<!CONO> AND H6DLIX=<!DLIX> AND H6PLSX=<!PLSX>
ORDER BY H6WHSL DESC
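To illustrate the ordering trick, here is a minimal sketch (with made-up location values) of why descending insertion yields an ascending read: Glass shows the newest card first, so each inserted card lands in front of the previous one.

```python
# Rows as returned by the SQL above, sorted DESC on stock location
rows_desc = ["WHSL-30", "WHSL-20", "WHSL-10"]

timeline = []
for row in rows_desc:
    # Glass prepends new timeline items (last in, first out)
    timeline.insert(0, row)

print(timeline)  # the picker swipes through locations in ascending order
```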
I will use the SIMPLEEVENT template’s HTML fragment and replace the sample values with item quantity ALQT, item number ITNO, item description ITDS, and stock location WHSL:
I will embed the HTML fragment in the JSON payload for the Mirror API:
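A minimal Python sketch of building such a payload and inserting it via the Mirror API timeline endpoint. The HTML markup is a simplified stand-in for the SIMPLEEVENT template, and the sample values (ITEM001, Widget, A-01-02, the delivery number, and the token handling) are placeholders, not the flow’s actual WebRun configuration:

```python
import json
import urllib.request

def build_timeline_item(alqt, itno, itds, whsl, dlix):
    """Build a Mirror API timeline item for one picking list line.
    The HTML is a simplified stand-in for the SIMPLEEVENT template."""
    html = (
        '<article><section>'
        f'<p class="text-auto-size">{alqt} x {itno}</p>'
        f'<p>{itds}</p>'
        f'<footer>{whsl}</footer>'
        '</section></article>'
    )
    # bundleId groups all the cards of one picking list together
    return {"html": html, "bundleId": str(dlix)}

def insert_item(item, token):
    # POST the JSON payload to the Mirror API timeline endpoint
    # with an OAuth 2.0 bearer token
    req = urllib.request.Request(
        "https://www.googleapis.com/mirror/v1/timeline",
        data=json.dumps(item).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

# Build (but do not send) a sample card for one picking list line
item = build_timeline_item("5", "ITEM001", "Widget", "A-01-02", 123456)
print(json.dumps(item, indent=1))
```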
The bundle cover will be:
"text": "Picking list <!DLIX>",
Infor Process File
My process file in Infor Process Designer is illustrated in this GIF animation (you can open it in GIMP to see the layers in detail):
I had to solve the following two trivial problems:
I still have the following problem:
- The subscription I used in my previous post, M3:MHPICL:U, seems to fire too early in some of my tests, and that is a blocking problem: when my flow runs, the SQL to get the picking list lines returns only the first line, the only line that exists in the database at that point in time, and misses the other lines, which have not yet been created. I must find a solution to this problem, but I haven’t been able to reproduce it.
From my previous post, I had the following picking list:
When I re-run the scenario, here are the resulting timeline cards in my Google Glass where black pixels are see-through pixels:
And here is a video capture of the result (I used Android Screencast, which captures at a slow frame rate):
And here is a picture of what it would look like to the user in Glass with see-through pixels:
Next, I will implement the following:
- Programmatically get the OAuth 2.0 token instead of copy/pasting it manually in the WebRun activity nodes.
- Show the item image from Infor Document Archive.
- Show walking directions on a warehouse plan.
- Tap to confirm the picking.
- Take a picture of the box at packing.
That’s it! Check out my previous posts on the project. Like this post. Tell me what you think in the comments section below. Share with your colleagues, customers, partners. Click the Follow button to subscribe to this blog. Be an author and publish your own ideas. And enjoy.