M3 picking lists in Google Glass @ Inforum

I am very pleased to announce that after months of working on it here and there in the evenings, outside work hours, I finally completed and presented both of my demos of M3 picking lists in Google Glass and Augmented Reality at Inforum. They were a success. I showed the demos flawlessly to about 100 people per day over six days, with a very positive reception. The goal was to show proofs of concept of wearable computers and augmented reality applied to Infor M3. My feet hurt.

Features

This is my second Glass app after the one for Khan Academy.

This Glass app has the following features:

  • It displays a picking list from Infor M3 as soon as it’s created in M3.
  • For each pick list line it shows the quantity (ALQT), item number (ITNO), item description (ITDS), and stock location (WHSL) as aisle/rack/level.
  • It displays the pick list lines as a bundle for easy grouping and finding.
  • It shows walking directions in the warehouse.
  • It has a custom menu action for the picker to mark an item as picked and to change the status of that pick list line in M3.
  • It uses the built-in text-to-speech capability of Glass to illustrate hands-free picking.
  • It’s bi-directional: from M3 to Google’s servers to push the picking list to Glass, and from Google’s servers to M3 when the picker confirms a line.
  • The images come from Infor Document Management (formerly Document Archive).
  • I developed the app in Java as an Infor Grid application.
  • I created a custom subscriber and added an Event Analytics subscription to the event M3:MHPICL:U.
  • It uses the Google Mirror API, for simplicity, to illustrate the proof of concept (see the sketch after this list).
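
To give an idea of what the Mirror API side looks like, here is a minimal sketch of pushing one pick list line to the Glass timeline through the Mirror API REST endpoint. It is written in JavaScript for brevity (my actual app is a Java Grid application), and the access token, the fields of the pick list line, and the menu item id are placeholders:

    // Minimal sketch: push one pick list line to Glass as a timeline card
    // through the Mirror API REST endpoint. ACCESS_TOKEN and the fields of
    // `line` are placeholders; error handling is omitted.
    const ACCESS_TOKEN = '...'; // OAuth 2.0 token with the glass.timeline scope

    async function pushPickListLine(line) {
      const card = {
        text: line.ALQT + ' x ' + line.ITNO + ' ' + line.ITDS + '\n' + line.WHSL,
        speakableText: 'Pick ' + line.ALQT + ' of ' + line.ITDS + ' at ' + line.WHSL,
        bundleId: line.pickListId, // groups all the lines of one picking list
        menuItems: [
          { action: 'READ_ALOUD' },          // built-in text-to-speech
          { action: 'CUSTOM', id: 'confirm', // custom action to confirm the pick
            values: [{ displayName: 'Confirm' }] }
        ]
      };
      const response = await fetch('https://www.googleapis.com/mirror/v1/timeline', {
        method: 'POST',
        headers: {
          'Authorization': 'Bearer ' + ACCESS_TOKEN,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify(card)
      });
      return response.json(); // the created timeline item, including its id
    }

When the picker taps Confirm, Google’s servers notify the app’s callback, and the app then calls M3 to change the status of that pick list line; that is the return leg of the bi-directional flow described above.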

I am making the resulting source code free and open source on my GitHub repository, and I have been writing up the details on this blog; I will post the remaining details soon.

Acknowledgements

I want to especially thank Peter A Johansson of Infor GDE Demo Services for always believing in my idea, his manager Robert MacCrate for providing the servers on Infor CloudSuite, Philip Cancino, formerly of Infor, for helping with the functional understanding of picking lists in M3, Marie-Pascale Authié of Infor Pre-Sales for helping me set up and create picking lists in M3 and for also doing the demo at Inforum, Zack Makris of Infor Labs for providing technical support, Jonathan Amiran of Intentia Israel for helping me write the Grid application, and some people of Infor Product Development, who chose to remain anonymous, for helping me write a Java application for Event Hub and Document Archive. I also want to especially thank all the participants of Inforum who saw the demo and provided feedback, and all of you readers for supporting me. I probably missed some important contributors: thank you too. And thanks to Google X (especially Sergey Brin and Thad Starner) for believing in wearable computers and for accelerating the eyewear market.

Screenshots

Below are the screenshots from androidcast. They show the bundle cover, the three pick list lines with the items to pick, the Confirm custom menu action, the Read aloud action, and the walking directions in the warehouse:


Vignettes

Below are three vignettes of what the result would look like to a picker:


Inforum

Here are some photos from Inforum:

In the Manufacturing area:

In front of the SN sign:

Holding my Augmented Reality demo:

Playing around with picking lists in virtual reality (Google Cardboard, Photo Spheres, and SketchFab):

Playing around with picking lists in Android Wear (Moto 360):

That’s it! If you liked this, please give it a thumbs up, leave a comment, subscribe to this blog, share it around you, and come help me write the next blog post; I need you. Thank you!

Augmented Reality for M3 – Hello World with Metaio Creator

Here is a Hello World illustration of Augmented Reality (AR) for Infor M3 using Metaio Creator and the Junaio Browser on my iPad. The demo shows a 3D warehouse with aisles, racks, and levels, in which I highlighted one of the boxes in red. This new result complements my previous demo of AR for M3, which was implemented programmatically in JavaScript; this time I am using Metaio Creator.

Why it matters

The idea is to highlight the stock location of the next item to pick in a picking list so the picker can quickly identify where to go in the warehouse. This scenario is especially useful for temporary workers who are hired for campaigns on short notice and are not yet familiar with the warehouse, thus saving costs in training and picking time.

Also, Augmented Reality is predicted to become one of the next multi-billion dollar industries within five years, so this is one of the learning steps I am taking in that direction.

Preview the demo

To preview the demo on your device (PC, Mac, iPad, Android) follow these instructions:

  1. Print the following satellite picture at full page size or bigger, and place it on a flat surface; it will be the trackable AR marker.
  2. Install the Junaio Augmented Reality Browser app on your device (from junaio.com for PC/Mac, from the App Store for iPad, or from the Google Play Store for Android).
  3. Open the app and click Scan.
  4. Scan the following QR code; Junaio Browser will identify the QR code and download the resources from my channel, ThibaudWarehouse3D.
  5. Point your device’s camera at the printed satellite picture. Junaio Browser will track the satellite picture and register the 3D warehouse accordingly. Here is a screenshot of the result.

How I built it

The creation process is simple.

I used my previous 3D model of a warehouse with racks, aisles, levels, and boxes, which I had created in SketchUp for a demo three years ago. I removed the walls and roof, as well as unnecessary 3D elements that slow down the 3D rendering pipeline on the iPad. And I hard-coded an arbitrary box in red.

Here is a screenshot of the trackable image and the 3D model in Metaio Creator:

Here is a screenshot of the channel creation:

Here is a video of the entire creation process and preview:

Summary

That was how to create a simple Hello World demo of Augmented Reality for M3, using a 3D warehouse and Metaio Creator to highlight the stock location of the next item to pick in a picking list, thereby saving training time and picking time.

Future version

In a future version, I will un-hard-code the red box, and I will highlight it programmatically using Metaio SDK.

That’s it! Like. Comment. Subscribe. Share. Author.

Thank you.


Augmented World Expo 2014

Last week I attended the Augmented World Expo (AWE) 2014 [1] in Santa Clara, one of the world conferences on Augmented Reality, Virtual Reality, Augmented Virtuality [2], and smart glasses [3]. There, I saw Steve Feiner, pioneer of Augmented Reality in the 1990s [4] [5], Professor of computer science and director of the Computer Graphics and User Interfaces Lab at Columbia University, and adviser for Space Glasses at Meta [6]. I also saw Mark Billinghurst, director of the HITLab in New Zealand [7], who created the ARToolKit, whose JavaScript port I later used for my prototype M3 + Augmented Reality. I didn’t see Steve Mann, also an adviser for Meta, and one of the pioneers of the Wearable Computing group at the Media Lab in the 1980s [8]; Thad Starner was in that group and later went on to design Google Glass for Sergey Brin [9]. I got inspiration from their work when I was younger, and I was excited to see them.

I went to the conference to learn more about the future. I’m currently working on a personal project to develop an app to display picking lists in Google Glass with data from Infor M3.

Here are some pictures of me at the conference, dreaming of my vision of future picking lists 😉


Google Glass

I just applied to get a pair of Google Glass.

Google Glass is an anticipated product from Google X that aims to bring Augmented Reality to the masses in a sporty, fashionable pair of glasses containing a video camera, a Heads-Up Display, a processing unit running Android, Wi-Fi connectivity, and a battery (c.f. the patent).

I was at Google I/O 2012, where they accepted pre-orders for the Glass Explorer Edition, but I made the regretful decision not to apply. Google is now offering a second chance: What would you do if you had Glass? Answer with #ifihadglass.

If I had Glass I would improve warehouse workers’ jobs: I would show walking directions to the picking location, I would display information about the item, and I would keep track of the picking list. I’m an enthusiast I/On working on AR in the enterprise.

This would be a continuation of my previous implementation of Augmented Reality for M3.

Here are my three concept pictures for Google Glass:


Here’s my application:


Wish me luck, and see you at Google I/O 2013.


M3 + Augmented Reality

In this article I introduce the first implementation, that I know of, of Augmented Reality for Infor M3. Augmented Reality is the ability to superpose digital information on top of real-world objects. This is achieved by locating the user’s head in space, determining the user’s point of view, registering real-world objects, and projecting virtual 3D objects accordingly. Implementing it has been a dear dream of mine. In this example I use fiducial markers and data coming from Item Master – MMS001.

Applications

Augmented Reality for M3 could be used for many applications. For example, it could help a worker find an item in the warehouse by showing optimized walking directions and the distance to possible picking locations. Also, it could show a worker contextual information at a glance.

I believe Augmented Reality is a disruptive technology and one of the next big revolutions in the software industry, with positive impacts similar to those of the Internet and mobile devices, and that it will reshape entire industries in the next 10 years.

Timeline & motivation

In 1998 I got a summer job in a warehouse for a company that sold car brakes. Every few minutes a printer spat out a picking list of items that I had to collect. As a temporary worker unfamiliar with the place, I spent most of my time wandering through the warehouse, searching for the items, and asking the more seasoned workers for help; I found that inefficient and wished the computer gave me a map with directions of where to go. Also, the picking lists were un-ordered, and I often had to go back to a previous location I had just visited; I found that inefficient and wished the computer optimized the picking lists. Also, once I found the location, I often discovered the boxes were empty and had to ask a forklift driver to replenish the stock location from a box on a higher shelf; I found that inefficient and wished the computer planned replenishment ahead of time. That was in 1998; nowadays, ERP and warehouse management systems are more common. Yet I kept my wish to make better systems.

Then, in 2001, I read about Professor Steven Feiner’s Augmented Reality project KARMA from 1992 at Columbia University. The system fit in a backpack and had a portable computer, batteries, GPS, a compass, and a head-mounted display. It would give detailed instructions to a user on how to repair a printer. That was my first exposure to Augmented Reality, and ever since I have been wanting to implement it.

In 2007 Apple introduced the iPhone, with a stunning user interface, graphics, and processing power, blowing everybody’s mind about mobility and redefining an industry. And in 2009 Apple added a camera to the iPhone 3GS. The hardware technology for Augmented Reality started becoming accessible to the masses.

In 2009 I met with Brad Neuberg of Google at the Google I/O conference and I started working on a client-side search engine for M3 source code. That was my first exposure to HTML5.

In 2010 I implemented my first Warehouse 3D demo using Google Earth, with real data fed from the ERP, and I projected the result on a large touch screen for an immersive experience. That was my first step towards implementing Augmented Reality for M3.

In 2011 I proposed an idea for an internal project for M3 + Augmented Reality on mobile devices.

In parallel, WHATWG and W3C have been working hard to standardize HTML5 with the ability to use the webcam in JavaScript with WebRTC, to access pixel data, to paint on the canvas, and to use WebGL for 3D rendering. The software technology for Augmented Reality is becoming accessible to the masses.

More recently I started working on geo-locating Stock Locations in M3. This opens the door to new applications for geo-coded data in M3.

Then, at the Google I/O conference this year, I met Ilmari Heikkinen, who pointed me to his HTML5 Rocks article, Writing Augmented Reality Applications using JSARToolKit. That was the last push I needed to implement actual Augmented Reality for M3. So I did.

Implementation

I used Ilmari’s source code and added a few lines of code to call an M3 API using REST in JavaScript when a marker is detected. In this example, the marker is mapped to an Item number (ITNO), but it could also be mapped to a Stock Location (WHSL), for example. Then, for that Item number, I call the M3 API MMS200MI.GetItmBasic and display the Name (ITDS), Description (FUDS), Basic unit of measure (UNMS), Volume (VOL3), Net weight (NEWE), and Gross weight (GRWE).
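
As an illustration of those few lines, here is a minimal sketch of the call. The /m3api-rest URL pattern, the marker-to-item mapping, and the displayItem function are assumptions for illustration, not the exact code of the demo:

    // Minimal sketch: when a fiducial marker is detected, look up its item
    // number and fetch item data from the M3 REST API. The URL pattern and
    // the marker-to-item mapping are assumptions for illustration.
    const markerToItem = { 64: 'ITEM-001', 65: 'ITEM-002' }; // marker id -> ITNO

    function onMarkerDetected(markerId) {
      const itno = markerToItem[markerId];
      if (!itno) return;
      fetch('/m3api-rest/execute/MMS200MI/GetItmBasic?ITNO=' + encodeURIComponent(itno), {
        headers: { 'Accept': 'application/json' }
      })
        .then(function (response) { return response.json(); })
        .then(function (item) {
          // displayItem is a placeholder for the page's rendering function;
          // it shows ITDS, FUDS, UNMS, VOL3, NEWE, GRWE below the canvas
          displayItem(item);
        });
    }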

Result

Here is a video of the result. Note the section below the canvas that shows M3 data coming from MMS200MI.GetItmBasic for the detected marker. We can see an activity indicator flickering as the markers are detected. For best viewing, watch the video on YouTube, in HD, and in full screen.

Source code

I provide the result for download at http://ibrix.info/ar/demo.zip with HTML and JavaScript source code, sample fiducial markers, and sample images.

Future work

With the simple example I introduced in this article, I illustrate that the hardware and software technologies for Augmented Reality have already become accessible to the masses. The technology is still maturing. There are on-going projects to provide registration without the use of markers. Also, sensors are becoming better for indoor location.

That’s it for now.

Please click ‘Follow’ to subscribe to my blog.

Geocoding of Stock Locations in MMS010

Here is a video that illustrates the process of setting the Geo Codes XYZ of Stock Locations in MMS010 in Smart Office, i.e. setting the latitude, longitude, and altitude of Stock Locations, a.k.a. geocoding. In my example I determined the coordinates based on a 3D model built in Google SketchUp and geo-located in Google Earth; a GPS receiver with good indoor accuracy would work as well. With geocoded information, we can present data from the Warehouse Management System in a graphical way. This is important for applications such as showing Stock Locations on a map, or finding the shortest path for a picking list.

Demo video

How to proceed

These are the steps I followed in the video to geolocate the Stock Locations in MMS010:

  1. I used this SketchUp model of a 3D warehouse that I had previously geo-located.
  2. I also used this other SketchUp model of the Stock Locations that I had previously uniquely identified.
  3. Then, I used this Ruby script to get the geocoding of the floor plan.
  4. Then, I used this other Ruby script to get the geocoding of each Stock Location; the conversion involved is sketched after this list.
  5. The result is this CSV file of the floor plan’s geocodes and each Stock Location’s geocodes.
  6. Then, I used this Lawson Web Service of type Display Program to set the values for the fields Geo Code X (GEOX), Geo Code Y (GEOY), and Geo Code Z (GEOZ) in MMS010/F for a specified Warehouse (WHLO) and Stock Location (WHSL).
  7. Then, I used a Visual Basic macro for Microsoft Excel to call the Web Service for all Stock Locations.
  8. Finally, I used this script to display the Geo Codes XYZ in MMS010/B1.
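
To make step 4 concrete, here is a minimal sketch, in JavaScript for readability (the actual scripts run in SketchUp’s Ruby API), of the kind of conversion involved: turning a Stock Location’s offset in the model (meters east, north, and up from the geo-located model origin) into latitude, longitude, and altitude, using an equirectangular approximation that is accurate enough at warehouse scale. The names and sample values are placeholders:

    // Minimal sketch of converting model offsets (meters) to geographic
    // coordinates, given a geo-located model origin. Equirectangular
    // approximation; fine at warehouse scale.
    const METERS_PER_DEGREE_LAT = 111320; // approximate

    function geocode(origin, offset) {
      return {
        lat: origin.lat + offset.north / METERS_PER_DEGREE_LAT,
        lon: origin.lon + offset.east /
             (METERS_PER_DEGREE_LAT * Math.cos(origin.lat * Math.PI / 180)),
        alt: origin.alt + offset.up
      };
    }

    // Example: a location 12 m east, 5 m north, 1.5 m up from the origin
    const geo = geocode({ lat: 42.04, lon: -88.03, alt: 250 },
                        { east: 12, north: 5, up: 1.5 });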

Result

The result is the list of Stock Locations in MMS010/B1 displaying all the Geo Codes XYZ.

Resources

  • Download the SketchUp model of the geo-located 3D warehouse.
  • Download the SketchUp model of the uniquely identified Stock Locations.
  • Download the Ruby script to get the geocoding of the floor plan.
  • Download the Ruby script to get the geocoding of each Stock Location.
  • Download the resulting CSV file of all Stock Locations and their Geo Codes.
  • Download the Lawson Web Service to set the Geo Codes XYZ of a Stock Location.
  • Download the script to display the Geo Codes XYZ in MMS010/B1.
  • Watch the video of the entire process.

UPDATE

2012-09-28: I had a bug in the Ruby script that miscalculated the Y and Z geocodes for the Stock Locations. I corrected the script and the resulting CSV file and I updated the links above.

Warehouse 3D demo

I implemented a Warehouse 3D demo that demonstrates the integration capabilities of M3 with cool stuff from the software industry.

The Warehouse 3D demo displays racks and boxes with live data coming from Stock Location – MMS010 and Balance Identity – MMS060. The Location Aisle, Rack, and Level from MMS010 are written dynamically on each box. The Status – Balance ID from MMS060 is rendered as the color of the box: 1=yellow, 2=green, 3=red, else brown. And the Item Number is generated dynamically as a real bar code that can be scanned on the front face of the box.

Here is a screenshot:

The demo uses the Google Earth plugin to render a 3D model that was modeled with Google SketchUp, Ruby scripts to geocode the boxes and identify their front faces, PHP to make the 3D Collada model dynamic, SOAP-based Lawson Web Services that call M3 APIs, and the PEAR and NuSOAP open source PHP libraries.

The result is useful for sales demos, and as a seed for customers interested in implementing such a solution.

Try for yourself

http://ibrix.info/warehouse3d/

You can try the demo for yourself with your own M3 environment. For that, you will need several things. You will need to install the Google Earth plugin in your browser. You will also need to deploy the Lawson Web Service for MMS060MI provided here; note that your LWS server must be in a DMZ so that the http://www.ibrix.info web server can make the SOAP call over HTTP. Also, you will need to follow the Settings wizard to set up your own M3 environment, user, password, CONO, WHLO, etc. The result is a long URL that is specific to your settings.

Constructing the 3D model

I built a 3D model with Google SketchUp.

Here is the video of the 3D model being built in Google SketchUp:

You can download my resulting SketchUp model here.

Identifying Aisles, Racks, Levels

Then, I set the Aisle, Rack, and Level of each box as in MMS010 using a custom Ruby script for Google SketchUp.

Here is a video that shows the script in action:

You can download this Ruby script here.

You can download the resulting SketchUp model here.

Identifying front faces

Then, I identified each front face of each box so as to dynamically overlay information, such as the Item Number, Item Name, etc. For that, I implemented another Ruby script.

Here is a video of that process:

You can also download this Ruby script here.

Exporting the Collada model

The original model is an SKP file, which is binary. I exported the model to a Collada DAE file, which is XML. The file is very big: 30,000 lines of XML.

The Collada file contains this:

  • Components (racks, boxes, walls, etc.)
  • Homogeneous coordinates (X, Y, Z, H) relative to the model
  • Absolute coordinates (latitude, longitude)
  • Orientation (azimuth, etc.)
  • Scale
  • Effects (surface, diffusion, textures, etc.)
  • Colors in RGBA

Off the top of my head, the Collada hierarchy in XML is something like this:

Node Instance
	Node Definition
		Instance Geometry
			Instance Material
				Material
					Instance Effect
						Color
						Surface
							Image

Making the model dynamic

The goal is to set the color of each box dynamically, based on the Location of the box, and based on the Inventory Status in MMS060.

Unfortunately, Google Earth doesn’t have an API to change the color of a component dynamically. So, I decided to change the XML dynamically on the server. There are certainly better solutions but that’s the one I chose at the time. And I chose PHP because that’s what I had available on my server ibrix.info; otherwise any dynamic web language (ASP, JSP, etc.) would have been suitable.

In the XML, I found the mapping between the box (nodeDefinition) and its color (material). So, I changed the mapping from hard-coded to dynamic with a PHP function getColor() that determines the color based on the Location and based on the result of the web service call.

The color is determined by the Balance ID: 1=yellow, 2=green, 3=red, else brown. The Balance ID is stored in the SOAP Response of the web service.
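
As an illustration, here is a minimal sketch of that mapping, written in JavaScript for consistency with the other sketches in this post (the original is a getColor() PHP function on the server); the exact RGBA values are placeholders:

    // Minimal sketch of the Balance ID -> color mapping. Colors are RGBA
    // floats, as in the Collada material definitions; values are placeholders.
    function getColor(balanceId) {
      switch (balanceId) {
        case 1: return '1 1 0 1';        // yellow
        case 2: return '0 1 0 1';        // green
        case 3: return '1 0 0 1';        // red
        default: return '0.6 0.4 0.2 1'; // brown
      }
    }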

Lawson Web Service

I created a SOAP-based Lawson Web Service for MMS060MI. I invoke the SOAP Web Service at the top of the PHP script, and store the Response in a global variable. To call SOAP Web Services, I use NuSOAP, an open source PHP library.
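
For illustration, here is a minimal sketch of such a SOAP call over HTTP, in JavaScript rather than the PHP/NuSOAP of the original; the endpoint and the request body are placeholders for the MMS060MI Lawson Web Service:

    // Minimal sketch of calling a SOAP web service and keeping the parsed
    // response around (the original does this in PHP with NuSOAP).
    // Endpoint and request elements are placeholders.
    async function callMms060(endpoint) {
      const envelope =
        '<?xml version="1.0" encoding="UTF-8"?>' +
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">' +
        '<soap:Body>' +
        '<!-- request elements for the MMS060MI web service go here -->' +
        '</soap:Body>' +
        '</soap:Envelope>';
      const response = await fetch(endpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'text/xml; charset=utf-8' },
        body: envelope
      });
      // parse the SOAP response; walk it to extract the Balance ID per location
      return new DOMParser().parseFromString(await response.text(), 'text/xml');
    }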

Generating front faces

I dynamically generate a texture for each front face as a PNG image with the Item Number, Item Description, Quantity, and the bar code. I set the TrueType font, the size, the XY coordinates, and the background color.
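
Here is a minimal sketch of that kind of texture generation, using the HTML5 canvas in JavaScript for illustration (the original generates the PNG server-side in PHP); the dimensions, font, and colors are placeholders:

    // Minimal sketch: render a front-face texture as a PNG with the item
    // number, description, and quantity. Layout values are placeholders.
    function makeFaceTexture(itemNumber, itemDescription, quantity) {
      const canvas = document.createElement('canvas');
      canvas.width = 256;
      canvas.height = 128;
      const ctx = canvas.getContext('2d');
      ctx.fillStyle = '#f5e6c8'; // background color of the box face
      ctx.fillRect(0, 0, canvas.width, canvas.height);
      ctx.fillStyle = '#000';
      ctx.font = '16px sans-serif';
      ctx.fillText(itemNumber, 10, 24);      // item number at the top
      ctx.fillText(itemDescription, 10, 48); // item description below
      ctx.fillText('Qty: ' + quantity, 10, 72);
      // the bar code image would be drawn here as well
      return canvas.toDataURL('image/png');  // PNG data URL to use as texture
    }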

Bar code

I generate an image of the bar code based on the Item Number using PEAR, an open source PHP library.

Settings wizard

I made a Settings wizard to assist the user in setting up a demo with their own M3 environment, user, password, CONO, WHLO, etc.

Applications

This Warehouse 3D demo illustrates possible applications such as:

  • Monitoring a warehouse
  • Locating a box for item picking
  • Implementing Augmented Reality to overlay relevant data on top of the boxes

Demo

Finally, I made a demo video using the back projection screen at the Lawson Schaumburg office, using Johnny Lee’s Low-Cost Multi-point Interactive Whiteboards Using the Wiimote and my homemade IR pens to convert the back projection screen into a big touch screen. The 3D model in the demo has 10 Aisles, 6 Racks per Aisle (except the first aisle, which only has 4 racks), and 4 Levels per Rack. That’s 224 boxes. There is also a floor plan that illustrates that structure.

Limitations

The main limitation of this demo is performance. When programming with Google Earth, we do not have the capability of dynamically changing a 3D model. I would have liked to dynamically set the color of a box and dynamically overlay text on the face of a box. Because that capability is lacking – there’s no such API in the Google Earth API – I chose to generate the XML of the 3D model dynamically on the server. As a result, the server has to send 30,000 lines of XML to the web browser over HTTP, it has to generate 224 PNG images and transfer them over the network, and the Google Earth plugin has to render it all. As a consequence, it takes between one and four minutes to fully download and render the demo. This design turns out to be inadequate for this type of application. Worse, it is neither scalable nor easily improvable. I would have to re-think the design from scratch to get a more performant result.

Future Work

If I had to continue working on this project (which is not planned), I would implement the following:

  • Ideally, we would generate boxes, colors, and text dynamically on the client-side, with JavaScript and WebGL for example. Google Earth doesn’t support that, and generating the model on the server-side turns out to be a bad design. So we need a different technique.
  • Also, we could use a better 3D client, like O3D.
  • Also, we would need to implement easy keyboard navigation, like the First Person Camera demo, and like the Monster Milktruck demo.
  • Also, we would need to implement hit detection, so as to click on a box and display more M3 data in a pop-up, for example. Google Earth supports event listeners but doesn’t yet support hit detection.
  • Finally, we would need to improve performance by an order of magnitude.

Thanks

Special thanks to Gunilla A for sponsoring this project and making it possible.

Resources

  • Download Ruby script to set the Aisle, Rack, and Level of each box as in MMS010
  • Download Ruby script to identify each front face of each box so as to dynamically overlay information
  • Download SketchUp model with floor plan, geo-location, racks, and walls
  • Download SketchUp model of boxes identified by Stock Location
  • Watch video of the 3D model being built in Google SketchUp
  • Watch video of the process of setting the Aisle, Rack, and Level of each box as in MMS010
  • Watch video of the process of identifying each front face of each box
  • Watch video of the demo on the large touch screen
  • Download Lawson Web Service for MMS060MI

UPDATE

2012-09-27: I added the SketchUp models and Ruby scripts for download.

M3 + Augmented Reality (idea)

Here is an idea that would be great to implement: M3 + Augmented Reality. I believe AR to be one of the next big revolutions in the software industry, and the technology is available today. We have mobile phones with cameras, GPS, compass, and millimetric indoor radio positioning, fast CPU for feature registration, localization and mapping, REST Web Services, etc. It went from being mostly reserved to research labs, to being the hype of emerging start ups. Get ready for the future 🙂