
Artisans on the blockchain

There are hundreds of thousands of locally produced products in Europe that are unique to a country, region, town or even a village. These products rely on their authenticity as a sign of quality: when we know without doubt that a product comes from an authentic source, we know its quality is guaranteed. How can these unique products, sometimes produced by a single community or family, verify their authenticity?

Blockchain technology has already been deployed by food companies in their supply chains. In August 2017, ten of the world’s biggest CPG and food companies partnered with IBM to integrate blockchain into their supply chains. This cohort — Walmart, Nestlé, Unilever, McCormick, Tyson, Kroger, McLane, Driscoll’s, Dole, and Golden State Foods — represents more than half a trillion dollars in aggregate annual global sales[i]. However, smaller producers should also be able to benefit from blockchain’s ability to store and share information across a network of users in an open virtual space. For the purposes of this research we are going to concentrate on Spain’s jamón ibérico, or Iberian ham.

Iberian ham comes in many forms, but to be defined as jamón ibérico de bellota (acorn-fed Iberian ham) it must first come from Iberian blackfoot pigs, or from 50% crossbreeds. These pigs must then spend several months of the year roaming the dehesa, a pasture planted with oaks, feeding on grass and acorns. During the last few months before slaughter they must live exclusively on this diet. Only 6% of total production comes from 100% pure black Iberian pigs raised on dehesas, and these are concentrated in two very specific parts of Spain: Andalucía and Extremadura. Nine out of every ten black labels come from these two regions.

As if all the labeling were not confusing enough, there are now question marks regarding the purity of the animals themselves. There are 435,000 registered specimens with the Spanish Association of Iberian Pig Breeders (Aeceriber). Around 341,000 of these were incorporated following a 2014 government decree giving breeders two years to register all genetically pure animals. But instead of demanding expensive DNA tests (costing at least €20 a head), the decree authorized the registration with nothing more than a visual inspection by an expert veterinarian.

This is big business. According to statistics published by the Chinese Customs Administration, sales of pork products from Spain to China fell in 2017, dropping by 7.1% in volume and 11.5% in value with respect to 2016, to a total of 372,985 tonnes worth $648.6 million (€574.2 million). The main reason was the rise in production in China and a drop in prices after an exceptional 2016, in which total Chinese pork imports (from all origins) reached a record 3,132 million kilos. Nevertheless, Spanish exports were less affected than imports as a whole, because total pork imports into China fell by 17% in volume and by 23% in value with respect to the previous year. Spain therefore improved its market share in 2017 and took first place as China’s meat supplier, beating Germany, Canada and the USA. The Chinese are learning breeding and production methods fast.

Figure 1: Evolution of the sales of pork products from Spain to China in terms of volume and value.

At present the European Commission guarantees the tradition, health and quality of each ham through Protected Designation of Origin (PDO) certificates, which attest that the product has been produced according to traditional methods and within a defined geographic region. The PDO seal also serves to distinguish these fine foods from lower-quality imitations and guarantees authenticity. Why not put the PDO on the Ethereum blockchain? Why not approach the breeders and breeders’ associations in Andalucía and Extremadura with an Ethereum blockchain solution for verifying authenticity?
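As a rough sketch of what that might look like: the PDO regulator or producer could hash each certificate and anchor that fingerprint on Ethereum, so a buyer can later re-hash the certificate they are shown and check it against the anchored value. The certificate fields below are invented and the anchoring step is only described in a comment; this is an assumption about how such a scheme could work, not an existing registry.

```python
# Illustrative only: fingerprint a PDO certificate so the hash could be
# anchored in an Ethereum transaction or smart contract. The certificate
# fields are made up; no real registry or contract is referenced.
import hashlib
import json

certificate = {
    "producer": "Example dehesa, Extremadura",
    "product": "jamón ibérico de bellota, 100% ibérico",
    "pdo": "Dehesa de Extremadura",
    "batch": "2017-0042",
}

def fingerprint(cert: dict) -> str:
    canonical = json.dumps(cert, sort_keys=True, ensure_ascii=False)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

anchored = fingerprint(certificate)          # this value would be written on-chain
print(fingerprint(certificate) == anchored)  # a buyer re-hashes and compares: True
```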

There are thousands of examples of artisan products in Spain alone which could benefit from verifying their authenticity: olive oil, Rioja wine, the endless varieties of cheese, fruit, nuts and other food and beverages. And that’s just the gastronomic delights that Spain has to offer; other products include ceramics, linen, leather, marble, musical instruments and more.

Verifying the authenticity of locally produced products will be as vital as the protection of our own personal online data in the years to come. Without it, imposters could drain millions of euros from rightful producers and potentially destroy entire communities.

Robotic Process Automation Disruption

‘We are seeing work with clients today which is very much around big data and robotic process automation (RPA), where in compliance — take anti-money laundering — you can take out thousands of roles. That is coming quite quickly now and that will sweep across the industry. Companies have really thrown bodies at this to deal with the demands of the regulators. They have had no option. But now we are shifting from a revolution of labor arbitrage and offshore to a revolution of automation around this.’

  • Richard Lumb, Head of Financial Services at Accenture, quoted at the World Economic Forum by the Financial Times.

 

Of the many ‘mini-revolutions’ moving through IT services, robotic process automation (RPA) is one of the most significant, as the statements above demonstrate. Lumb says that many of the jobs created by banks in recent years for compiling and checking data on customers and transactions had already been moved offshore to lower-cost countries. In the next wave of automation, these jobs may simply disappear.

 

RPA in Action at Global Banks

Last July American Banker reported that BNY Mellon had introduced RPA into its operations in order to lower costs. “We’ve been piloting robotics and machine learning processes to automate work and eliminate repetitive manual tasks,” said BNY Mellon CEO Gerald Hassell during the bank’s second-quarter conference call. “It’s really taking some of the manual mind-numbing exercises out of the process and doing it at a lower cost.”

 

BNY Mellon’s experience was a bellwether for what was discussed by financial firms during the World Economic Forum at Davos. During a panel on the future of banking, the rise of fintech and the possibility of virtual currencies, John Cryan, CEO of Deutsche Bank, talked about the critical role technology will play in the immediate years ahead, as banks strive to reduce costs and execute processes and services in the efficient manner that customers and clients have come to expect.

 

Cryan specifically acknowledged the benefit of implementing technology to manage and monitor internal oversight. “We can use technology to improve our own controls. We can use technology to improve our efficiency and then we can use technology to improve the customer service,” Cryan said.

 

A recent Citibank report states that the banking industry spends $270B – or approximately 10 percent of operating costs – on the handling of compliance and regulation, much of which involves manual work to comply with oversight rules. Citi also estimates that European and US banks have paid more than $150B in litigation and conduct charges since 2011, a figure that the industry no doubt hopes to lower through the use of RPA.

 

 

It’s not only financial services that are benefiting from RPA. Examples cited by Gartner of front-, middle- and back-office deployments of RPA include:

  • Human Resources tasks such as employee onboarding, recruitment and payroll. RPA is being used to coordinate the different requests that need to be made when an employee joins a firm — from notifying payroll, to granting IT access and passwords, to allocating a desk. The benefit to the firm is that one system can be linked to multiple other systems, eliminating the rekeying of data between them across the entire set of activities a new starter requires (a minimal sketch of this pattern follows this list).
  • Customer Management. RPA aids with onboarding, processing rule-based activities triggered from websites, interactive voice response (IVR) systems, web chat and mobile apps, collating data from multiple disparate systems for customer service, orders, and checking systems for price and delivery offers. The benefit of adding in an RPA tool is that customer-service employees can more quickly access all necessary data – including that from the legacy systems of companies acquired in the past.
  • Finance and Accounting. Checking correct input of invoice or order entry data, and collating of reporting data from multiple systems for month- and year-end reports are among the ways organizations use RPA for finance and accounting. The benefit here is that the RPA tool can run a “soft close” each day and alert managers of potential problems.
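As a purely illustrative sketch of the onboarding case above: a single rule-driven bot takes one new-hire record and fans it out to several systems, so nothing is rekeyed by hand. The record fields and the three ‘systems’ are invented stand-ins, not any real RPA product’s API.

```python
# Hypothetical RPA-style onboarding bot: one new-hire record is fanned out
# to several downstream systems. The "systems" are stand-in functions.
from dataclasses import dataclass

@dataclass
class NewHire:
    name: str
    start_date: str
    department: str

def notify_payroll(hire: NewHire) -> str:
    return f"Payroll record created for {hire.name} (starts {hire.start_date})"

def grant_it_access(hire: NewHire) -> str:
    return f"Account and password issued for {hire.name}"

def allocate_desk(hire: NewHire) -> str:
    return f"Desk booked in {hire.department}"

def onboard(hire: NewHire):
    # The bot applies the same fixed rules to every record, in order.
    steps = (notify_payroll, grant_it_access, allocate_desk)
    return [step(hire) for step in steps]

for line in onboard(NewHire("A. Example", "2017-03-01", "Finance")):
    print(line)
```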

 

RPA Impacts Service Providers as Well

RPA is also playing havoc with the business models of Indian pure-play companies. While most large, global IT services providers are facing a steady decline in revenue, the India-based firms have resisted this trend with their wage-arbitrage/”bums-on-seats” strategy.  That strategy, however, is facing significant challenges in a Cloud-centric, Digital world, and labor-cutting RPA processes will only add to the woes of India-based firms.

 

Among larger global competitors, Accenture and Blue Prism announced in January that they are working together to provide RPA solutions for clients across industries. ‘More than 40 organizations have already selected Accenture and Blue Prism to help achieve this, including international retailer Circle K and Raiffeisen Bank International’, according to Horses for Sources (HfS).

 

Although not often considered a traditional IT service provider, Deloitte is the strongest riser in the HfS RPA Premier League Table (see below) and this is the type of offering that fits with the firm’s holistic transformation agenda. Specifically, Deloitte is combining its strong client traction across verticals with a shared services development platform out of India. For RPA, Deloitte is pursuing a partner ecosystem approach with the leading RPA vendors, while adding its own Cognitive Automation capabilities.

 

EY, meanwhile, says ‘RPA will quite quickly convert from a differentiator delivering a competitive advantage to a standard practice that needs to be followed for survival’. EY has embedded RPA into its broader Smart Automation Framework, with a strong focus on transformational projects that move clients toward self-service and centres of excellence (CoEs). At the same time, EY emphasizes its close partnership with Blue Prism. Overall, PwC estimates that 45% of work activities can be automated, and that this automation would save $2 trillion in global workforce costs.

 

For its part, NTT Data’s Automated Full-Time Equivalent (AFTE) helps automate repetitive, high-volume and rules-based tasks with a suite of more than 50 tools to meet a wide variety of industry and business needs. NTT Data (which acquired Dell Services in 2016) combines RPA with a command center.

 

According to NTT Data:

  • ‘With the emergence of robotic process automation technologies, the business process outsourcing industry is able to do more than ever before. But without the ability to measure efficiency and oversee the entire automation cycle, it can be difficult to sustain long-term strategic growth.’
  • ‘To help you fully control, understand and analyze the efficiency of AFTE, we offer a command center featuring real-time statistics, performance monitoring and analytical capabilities. The NTT DATA AFTE Command Center maintains a fully auditable record and monitors the effectiveness of machine-based work. It aggregates tasks, assesses cross-process penetration levels and distinguishes between machine- and human based tasks.’

 

(For more on command centers, see the blog “Why we need to become better listeners – The case for a command center approach”).

 

The larger providers noted above are not the only companies exploring RPA. Horses for Sources (HfS) identified some pure-play companies in its 2016 RPA Premier League Table (see below). These smaller organizations are in the vanguard of RPA deployment thanks to their technical expertise. HfS expects that most of them will be absorbed by the leading management consultancies over the next 18 months. We shall see.

[HfS 2016 RPA Premier League Table]
Gartner, in its Forecast Snapshot: Robotic Process Automation, Worldwide, 2016 report, forecasts that the robotic process automation software market ‘will grow by 41% year over year to 2020. Technology product marketing leaders must plan for most growth coming from new deployments, though this growth will be tempered by next-generation AI products.’

 

‘Key Findings

  • By 2020, end-user spending on robotic process automation (RPA) software will reach $1 billion, growing at a compound annual growth rate (CAGR) of 41% from 2015 through 2020.
  • By 2020, 25% of organizations using RPA will have two or more RPA software tool types deployed and multiple artificial intelligence (AI) tools.
  • By 2020, RPA tools will have evolved to include more types of functionality, such as AI software, but will experience strong downward pricing pressure.
  • By 2020, 40% of very large global organizations will have adopted an RPA software tool, up from an estimated less than 10% today.’
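As a quick back-of-envelope check on those two headline figures, a 41% CAGR ending at $1 billion in 2020 implies a 2015 base of roughly $180 million. This is just arithmetic on Gartner’s own numbers, not an independent estimate.

```python
# Back-of-envelope: what 2015 base do Gartner's two figures imply?
cagr = 0.41
spend_2020 = 1_000_000_000            # USD, Gartner's 2020 forecast
implied_2015 = spend_2020 / (1 + cagr) ** 5
print(f"${implied_2015 / 1e6:.0f}M")  # ≈ $179M
```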

 

At the 2nd Annual Robotic Process Automation in Shared Services Summit in Chicago in 2016, Swetal Desai, Vice President of Business Process Improvement at HP Enterprise, said:

 

  • Robotic Process Automation is no longer hype
  • RPA should not be run as a technology project but as a business transformation priority, part of a more comprehensive long-term company strategy
  • Process thinking (Eliminate/Simplify/Standardize/Automate) is critical and drives bigger ROI
  • Selecting the right processes to automate is key
  • The market, technology and vendors are evolving… stay up to date
  • Don’t try to align processes to an RPA product; invest in partners and products capable of automating the most. There are no “do it all” RPA suites
  • No one team can do it… it requires management of a complex ecosystem of players: business stakeholders, technology and development partners, the internal IT organization, process excellence teams and finance
  • Do not underestimate the change management challenges and the impact on the workforce

 

The evolution of Intelligent Automation in general, and RPA in particular, is one of the most disruptive shifts our industry has witnessed. 2017 will be the year that RPA really shakes up the industry as the labor arbitrage business model is finally laid to rest. RIP.

MITProfessionalX Course Diary – IOTx Internet of Things: Roadmap to a Connected World – Week 5 & Conclusion


Week 5 of the MITProfessionalX IOTx Internet of Things: Roadmap to a Connected World course concluded with the Applications module, specifically:

Beyond IoT – Ubiquitous Sensing and Human Experience (Joe Paradiso)

    • Emerging Descriptive data standards for IoT and sensors
    • Immersive visualization of diverse sensor data using game engines (part of IoT’s ‘control panel’)
    • Wearable sensing for IoT (future user interfaces for IoT – new ways to control and interact with your environment)
    • Sensors and paradigms for seamless Interaction with the Built Environment (lighting, heating, etc.)
    • Smart Tools for IoT
    • Smart, sensate materials

Wireless Technologies for Indoor Localization, Smart Homes, and Smart Health (Dina Katabi)

      • Smart health
      • Home automation
      • Location tracking

Smart Cities (Carlo Ratti)

      • The city as a cyber physical system
      • Principles of cybernetics: sensing and actuating
      • Collection of information: opportunistic sensing (a)
      • Collection of information: crowd sensing (b)
      • Collection of information: ad hoc sensing (c)
      • Response of the system: analytics and optimization
      • Response of the system: distributed action, people as intelligent actuators
      • Price of anarchy
      • Hacking the city: the risk for cyber attacks in centralized and distributed systems
      • Smart city equals Smart Citizens

The final module of the course (and the previous module) complemented, conveniently, much of what was revealed this week at Google I/O 2016 – Smart Homes, Smart Buildings and Smart Cities. The world is changing very quickly. Things will change in the near future as sensors become ubiquitous and the way we plug into them becomes more and more intimate. The sensors are already out there, piggybacking on devices that are already in place. Sensors are getting cheaper, and as the cost comes down everything becomes accessible and the ability to innovate will be widespread. At I/O this year, Google displayed its vision for a more ubiquitous and conversational way of interacting with technology. Its Assistant is chattier, answering natural language queries with a more human voice, and it’s found its way into several new Google products: the messenger Allo and the Echo-like speaker Home. Both are areas where other companies have a lead, but Google’s strength in AI gave these services some nice twists, doing things like automatically generating surprisingly specific reactions to photos. But you don’t have to have Google’s resources to be able to play in this space; a Raspberry Pi or Arduino can get things going, and as Professor Sarma pointed out in his last slide: ‘Just do it, thoughtfully. But do something.’

  • Why? Because IoT is in your future, and IoT literacy is essential.
  • IoT is very personal to your company. You need to figure out how it will impact your business.

His roadmap, beyond the ‘walled gardens’ of Nest, HomeKit and Smart Hub, is encouraging, and his advice applies to all of us and our organisations.


 

‘Finally, over time, I think that what’s going to happen is we’re going to go to a three-tier architecture. You have the device, you have the cloud, and you have edge computing, if you need performance. And that’s really my prediction for where this world is going to go.
IoT is in the future. Devices you buy will be IoT-enabled. Your homes will be IoT enabled. And it’s going to become a competitive thing.
And so what you need is what I call IoT literacy. It’s a way of thinking, which is how do I instrument and take advantage of it because it is happening. Don’t fight it, in fact, try and win it.
Just imagine if you had fought the cell phone 10 years ago. If you didn’t use your iPhone, your Android phone, or your Microsoft phone. If you didn’t do text messaging. If you didn’t do scheduling on your phone. If you didn’t use Google Maps or Apple Maps, just imagine, you would have been at a disadvantage.
I would use the same thing. I mean if you have a factory that refuses to monitor valves using connectivity, compared to a company that has a factory that does. And if their insurance goes down, you’re at a disadvantage. So it is in your future. I predict it. And don’t fight it.
But it is very personal to you. What I mean by that is when we bring the technology in, let’s say a cell phone. The cell phone is very personal to me. I use it in a way that is different from even a close colleague of mine. For example, I may use certain features more than she does.
My wife and I use our cell phones subtly differently, but within our family we have a certain pattern. We know how to reach each other. We prefer text to a call. IoT is like that.

If your business is your family, you will adapt IoT to your business. Your business probably has an advantage – you do something different and it gives you an advantage. So IoT has to wrap itself around that, so that you can use IoT to make the thing that makes you different more advantageous.
And so you have to figure out how to use it. Now, I’m not saying don’t work with consultants. But if you work with a consultant, work with a consultant who understands the process. The IT part of it will come later.

But if you start with the IT, you will put the cart before the horse. The IT will dictate what you should be doing as opposed to the process. So figure out your process and figure out precisely where IoT can help you, then let’s figure out the IT.

The next thing I recommend is build a real system and try and use it. I assure you the learnings will be fundamental. And it will give you a very gut-level, visceral IoT literacy that you will need.
And here’s the next thing– be ready to fail, be ready to iterate just as you would with math, just as you would with a new technique, just as you would if you, for example, decided to go buy a bike, and you’ve never ridden a bike. But you’ll figure it out. It’s the same thing. You got to learn to iterate because, again, this is going to be deep in your use, and you’ve got to figure out how this thing works.’
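To make the three-tier point from the start of that quote concrete, here is a small, hypothetical sketch of the device, edge and cloud split: the edge tier filters raw readings locally so only the interesting ones travel on to the cloud. Nothing here comes from the course; the sensor values, thresholds and function names are all invented.

```python
# Hypothetical three-tier flow: device -> edge -> cloud.
# The edge tier keeps filtering local and forwards only unusual readings;
# the values and thresholds are made up for illustration.
import random

def device_readings(n: int):
    """Device tier: simulate a temperature sensor."""
    return [round(random.gauss(21.0, 0.5), 2) for _ in range(n)]

def edge_filter(readings, low=19.0, high=23.0):
    """Edge tier: drop normal readings, pass anomalies upstream."""
    return [r for r in readings if not (low <= r <= high)]

def cloud_ingest(events):
    """Cloud tier: stand-in for long-term storage and analytics."""
    print(f"cloud received {len(events)} event(s): {events}")

cloud_ingest(edge_filter(device_readings(100)))
```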

This has been an excellent course and the whole experience has enhanced my understanding of the Internet of Things, its technologies and applications. Taking the course has also prompted three important takeaways:

  • Updating our skills is critical, we must be constantly learning
  • We must get comfortable with change
  • If we don’t take heed of the first two points, things could get tricky for ourselves and our organisations – see Twelve ways to survive the race to irrelevance

MITProfessionalX Course Diary – IOTx Internet of Things: Roadmap to a Connected World – Week 4

Week 4 of the MITProfessionalX IOTx Internet of Things: Roadmap to a Connected World course continued with the Technologies module, specifically:

Security in IoT (Srini Devadas)

  • Why is security for IoT so hard?
  • Threat models
  • Defensive strategies and examples

HCI in an IoT World (Jim Glass)

  • Theory and applications of spoken dialogue for human-computer interaction
  • Combining speech with other modalities for natural interaction
  • Considerations for multilingual interactions
  • Paralinguistic information from speech for enhanced HCI
  • Future challenges for ubiquitous speech interfaces

Robotics and Autonomous Vehicles (John Leonard)

  • Potential benefits of self-driving vehicles and service robots
  • Sensing and data processing
  • Simultaneous mapping and localization
  • Levels of autonomy
  • Future research challenges

Security has been the most complex topic so far, in my opinion. Security is a huge challenge, and when privacy is added to the mix it’s clear they are two very, very important topics in the Internet of Things. Srini Devadas (Professor at the MIT Computer Science and Artificial Intelligence Laboratory, with an MS and PhD from the University of California, who received the IEEE Computer Society Technical Achievement Award in 2014 for inventing Physical Unclonable Functions and single-chip secure processor architectures) says ‘security is a challenging problem, because it’s a negative goal’. Professor Devadas uses the example of accessing a .txt file. He lists a myriad of ways an attacker could get at the .txt file, and he could keep going on and on. How do you know that you’ve thought of all the ways to stop an attack? You don’t, and that is why security is a challenging problem: it’s a negative goal.

There are three defensive strategies for IoT systems: prevention, resilience, and detection & recovery. This is where the complexity factor begins to nudge up quite a few notches, with concepts like physical unclonable functions (PUFs), which protect integrated circuits from physical attacks aimed at extracting the secret keys stored inside them. I had to read the transcript a number of times, as the video lecture alone wasn’t enough to understand the concepts in one sitting. In particular there was a section that was very mathematical and gave examples of the Gen algorithm and the Learning Parity with Noise problem. Professor Devadas uses an interesting analogy to describe the notion of computation under encryption.

Let’s say Alice wants to buy a beautiful ring for herself. Not only that, she wants to design this ring. She is going to hire jewelry workers to create this ring for her and give them raw materials to do this. But there’s a problem here. The problem is one of theft. The jewelry workers could create the ring for her and just walk away with the ring. How could she protect against this scenario?

Alice could create a locked glove box and put her raw materials inside it. The jewelry workers then put their hands into the locked glove box, work on the raw materials, and create a ring, except that they have no idea that they’re even creating a ring. Only Alice knows that they’re creating a ring for her. The jewelry workers, after they have finished their task, take their hands out of the locked glove box and walk away. Alice will presumably pay them for their work. But now Alice can open up the locked glove box in private, take out her beautiful ring and enjoy it.

Given this jewelry example, let me tell you exactly what happens from a mathematical standpoint. The analogy here is that encrypting is putting raw materials into the locked glove box. So the raw materials correspond to sensitive data that’s associated with Alice’s DNA, for example.

Decrypting is taking things out of the box. As mentioned, the jewelers have no idea that they’re building a ring; they simply produce an encrypted result in the mathematical domain. Alice is able to take the encrypted result and decrypt it to obtain her diagnosis. The computation is the process of assembling the jewelry, and this corresponds to computing on encrypted data. We need particular mathematical structures for the encryption and decryption algorithms to ensure that the computation on the encrypted data, which produces an encrypted result, gives exactly the same answer as if Alice had computed on the original sensitive data using standard operations.
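The glove box maps onto homomorphic encryption. The course doesn’t give this exact example, but a minimal sketch using textbook (unpadded) RSA, which happens to be multiplicatively homomorphic, shows the core idea: multiply two ciphertexts and the decryption equals the product of the plaintexts.

```python
# Toy illustration of computing on encrypted data: textbook RSA is
# multiplicatively homomorphic, so Enc(a) * Enc(b) decrypts to a * b.
# Tiny primes, no padding: never use this construction for real security.
p, q = 61, 53                  # toy primes
n = p * q                      # modulus, 3233
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 12, 7
mixed = (enc(a) * enc(b)) % n   # the "work inside the glove box"
print(dec(mixed))               # 84 == a * b, recovered without exposing a or b
```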

The professor gave one example for each of the defensive strategies: ‘There are many other examples. Typically, these examples correspond to different layers of abstraction or correspond to different layers of software and hardware in an IoT system. To build a secure system may require such mechanisms at all layers of abstraction – the compiler, operating system, the application, and the hardware.’

The HCI and Robotics & Autonomous Vehicles lectures were an interesting history lesson on how both these technologies, via Siri, Cortana, Alexa, Google’s driverless car and the like, are testament to the pace of technological change. The future is much closer than we think.

Jim Glass:

I think speech based interfaces for IOT is inevitable. Our devices are getting smaller. We want to talk to them all of the time. It’s just so natural for people. We’ve crossed that point in our society where speech is out there and people want more of it. And I think that’s what is going to happen.

These interfaces are the future. They have to be untethered. They have to be robust to different environments, different contexts. They have to understand in larger context. Have to incorporate different modalities. Have to be multilingual. The types of things we see out there now coming out of the commercial market on smartphones and other devices is just the tip of the iceberg. Much more remains to be done. There’s lots of challenges, but the future is exciting.

And finally from John Leonard:

I want to see learning on steroids, lifelong learning where you can really think about the limit, as time goes to infinity, how does a system get better and better and learn more and more about the world?

And ultimately this entails connecting to the cloud. When one robot learns a Coke can, every robot should know what a Coke can is. This notion of sharing information, things getting logged to the cloud. I have this notion of a robot that operates autonomously each day, capturing new experiences. And then at night when it goes back and connects to charge its batteries, there’s a sort of dreaming that happens overnight, of trying to make sense of all the data of that day and connect it to the data previously acquired by itself and other robots to try to build ever richer and deeper understandings of the world.

Next week the course covers applications, specifically: Beyond IoT – Ubiquitous Sensing and Human Experience, and Wireless Technologies for Indoor Localization, Smart Homes, and Smart Health.

MITProfessionalX Course Diary – IOTx Internet of Things: Roadmap to a Connected World – Week 3

Week 3 of the MITProfessionalX IOTx Internet of Things: Roadmap to a Connected World course concentrated on the Technologies module, specifically:

Network Connectivity for IoT (Hari Balakrishnan)

  • A simplified IoT network architecture
  • Room/body-area networks: Bluetooth Low Energy
  • Extending communication range

Data Processing and Storage (Sam Madden)

  • Managing high rate sensor data
  • Processing data streams
  • Data consistency in an intermittently connected or disconnected environment
  • Identifying outliers and anomalies

Localization (Daniela Rus)

  • Localization algorithms
  • Indoor localization
  • Localization for mobile systems
  • Applications

And this is where things (no pun intended) start to get complicated. There is the complexity of IoT networking options, for example. Why can’t we just use the wireless technologies that we already have for the internet and our cellphones to build IoT systems? Can’t we just use cellular networks and Wi-Fi technologies? Why do we need something new? The answers aren’t immediately obvious, but when you think about it, cellular networks are limited by the battery life of, for example, your mobile devices (aka gateways) and are expensive, while Wi-Fi networks are limited by their range. The fundamental problem of power consumption is why cellular and Wi-Fi technologies are not applicable to a wide range of IoT scenarios.

Basically, IoT is about unusual events. Well, more specifically, data is at the core of those events. Consider applications in the space of infrastructure monitoring, like home monitoring, or monitoring pipes or other industrial equipment, or medical device monitoring. This is really about understanding when something interesting happens in these monitored devices, and the interesting thing that happens is fundamentally conveyed in data. For example, you might want to know that the temperature in your home went below some threshold and the pipes are about to burst. Or with a medical patient, you might want to see some signal, like a brain or heart signal, showing some sort of anomalous value. Data starts at the sensors, flows through phones and base stations, and ultimately ends up in a cloud-based infrastructure.

Then there is the issue of missing and noisy data. These sensors, because they are sampling the real world, have periods of time that are not covered by the data itself. Also, the data coming from these sensors and applications often has anomalies in it, things that are unusual or outliers. And so one of the real challenges is: how do we detect and correct those kinds of outliers and anomalies? The classification-based method and frequent itemsets, of course! Classification, for those of you like me who weren’t aware of it, is basically a way of taking a labelled data set and training a model to divide examples into multiple classes, such as outlier versus inlier. Frequent itemset mining basically compares the frequency of different sets of attributes in the outliers to the frequency of the sets that occur in the inliers: what are the common things that occur in the outliers? In the frequent itemset mining world it’s about support. Support means detecting the elements that occur in a set of data with more than some minimum frequency, i.e. more than two times.
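As a concrete (and purely illustrative) example of flagging an anomalous sensor reading, the snippet below applies a simple two-sigma rule to a made-up temperature stream; it is not the classification or frequent-itemset approach from the lecture, just the simplest possible outlier check.

```python
# Minimal, illustrative outlier check on a stream of sensor readings.
# The readings and the two-sigma threshold are invented for the example.
from statistics import mean, stdev

readings = [20.1, 20.3, 19.8, 20.0, 20.2, 3.5, 20.1, 20.4]  # °C, one bad sample

mu, sigma = mean(readings), stdev(readings)
outliers = [x for x in readings if abs(x - mu) > 2 * sigma]
print(outliers)  # -> [3.5]
```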

However, the most complicated part is localization. Devices will have to instantaneously localize themselves. They will have to have a sense of identity and a sense of the surrounding world. How does a device compute its position and its heading in the world? Range-based localization and bearing-based localization, of course! Unfortunately, this is where trigonometry and algorithms begin to play their part. I never thought I’d be uttering the words ‘robust quadrilateral’, but that’s just the rabbit hole this course is taking me down, and I have to admit I’m thoroughly enjoying it.
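To make range-based localization a little more concrete, here is a small, illustrative 2D trilateration sketch: given distances to three anchors at known positions, the linearized system is solved with least squares. The anchor coordinates and measured ranges are invented for the example, and this is not the course’s robust quadrilateral method.

```python
# Illustrative 2D range-based localization (trilateration) by least squares.
# Anchor positions and measured ranges are made up; the true point is (3, 4).
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
ranges = np.array([5.0, 8.06, 6.71])   # measured distances to each anchor

# Subtract the first range equation from the others to linearize:
# 2 (x_i - x_0) . p = (r_0^2 - r_i^2) + (|x_i|^2 - |x_0|^2)
A = 2 * (anchors[1:] - anchors[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2) + (
    np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
)
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(position)   # ≈ [3.0, 4.0]
```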

This week the Technologies module concludes with Security in IoT, HCI (Human Computer Interaction) in an IoT World and Robotics and Autonomous Vehicles.

MITProfessionalX Course Diary – IOTx Internet of Things: Roadmap to a Connected World – Week 1&2

‘By 2020, there will be 50 billion devices connected to the Internet.  How will you and your organization capitalize on this tremendous opportunity?

While the promise of the Internet of Things (IoT) brings many new business prospects, it also presents significant challenges ranging from technology architectural choices to security concerns.  MIT Professional Education’s new Internet of Things: Roadmap to the Connected World course offers important insights on how to overcome these challenges and thrive in this exciting space.’

I’m in the second week of the ‘MITProfessionalX: IOTx Internet of Things: Roadmap to a Connected World’ course. In the first week Sanjay Sarma introduced the Internet of Things (IoT) in a very broad sense and posed some interesting questions that are being discussed by the course participants, of whom there are over 500 from all corners of the world. This week the course began to detail IoT architectures:

The Architecture of IoT (Sanjay Sarma)

  • RFID Story
  • Opportunities for IoT
  • Some interesting IoT projects
  • Architecture of IoT

The Web of Things (Tim Berners-Lee)

  • Linked data – value is greatest when linked
  • Enterprise data – shared v. public v. private
  • Importance of security, privacy and authenticity
  • Standards
  • Web of Things layer – driver for IOT systems

Lessons from the Internet (David Clark)

  • Is the Internet the right technology to hook together a network of things?
  • The key lessons that our experience with the Internet teaches us about a future of things.
  • A focus on network management, security, mobility and longevity.
  • The desirable features of a distributed architecture for a system of things.

RFID Deep Dive (Sanjay Sarma)

  • Case Study – RFID

As you can see, MIT faculty leaders are at the forefront of the IoT space and are instructors on the course. Professor Sanjay Sarma, co-chair of the MIT Auto-ID Labs, Sir Tim Berners-Lee, inventor of the World Wide Web and founder of the World Wide Web Consortium (W3C), and David Clark, senior research scientist at the MIT Computer Science and Artificial Intelligence Laboratory, gave their lectures via pre-taped videos. Synchronized video transcripts and a compiled transcript of all course lectures are available to participants, as well as PDF presentation slides; the videos are not available for download. Assessments reinforce key learning concepts presented in each module, and short case studies, focused readings, discussion forums where participants address thought-provoking questions posed by MIT faculty, and a community wiki for additional resources, suggested readings and related links all add to the online learning experience. There are also social networking groups on Facebook and LinkedIn. I have to admit, however, that the MIT edX platform is a little basic. Discussion threads cannot be sorted by latest comment, for example, which makes them hard to follow, following other participants isn’t possible, and the training admins seem a little overwhelmed by the number of conversations taking place. Overall though, the course has been really interesting so far.

I think Sir Tim summed the situation up very well in terms of the challenges and opportunities companies face.

‘So the Internet of things is a wave, which is coming, and it is going to be very valuable. And it will be much more valuable to the people and companies who have figured out how to use it in advance, who have made standards, who are prepared to cope with the massive diversity and different sorts of information about a given thing, companies who learned to link together things of very, very different types, and create meaningful information. Companies who’ve, from this diversity, managed to make a consistent view of what’s going on and have then designed things which will react to it in a timely fashion and produce the right actions.

The web of things is about building standards that help a company, help an organization, help a computer, step back, look at everything that’s going on, and understand it, and build these complex systems by doing integration across all that diversity, all that complexity, and all those feedback systems at different speeds. It’s exciting that in a way we have the opportunity, perhaps a little bit because of this hype cycle, while everybody in the media are talking about the hype, about how exciting it’s going to be. Meanwhile, those of us who are building serious systems have a chance to talk to our partners, pick who we’re going to discuss, and build systems which will solve these massive issues which we will be faced with of diversity, of the data under linking. So, exciting? Yes, but we have work to do.’

Next week the course moves on to technologies, specifically: Network Connectivity for IoT, Data Processing and Storage, and Localization.

Me and Mr Jones

It was a sad start to 2016 with the death of David Jones, better known of course as David Bowie. He was part of some of my most abiding memories growing up: those nights snogging with Vicky Hunter listening to ChangesOneBowie over and over again (I think our faces only parted when the record needed turning over); watching The Man Who Fell to Earth for the first time late one Sunday night with my Dad and feeling a little uncomfortable when the naughty bits were on; listening to David Live on my car stereo, the cassette on a constant loop for weeks. Bowie continued to be a part of my life through adolescence and into adulthood; I was in awe of the man. Much has been written about the master of reinvention in the last month, Blackstar has been a regular Spotify favourite of mine, and the world has been a much better place with David Bowie in it.

Each year, throughout December and January, public and private organizations from government think tanks to charities, banks and Silicon Valley tech giants release their predictions for the year ahead. From time to time some of these predictions are insightful and can help us plan ahead; sometimes not. David Bowie was an extraordinary man, but predicting the power of the internet in this 1999 video was just genius. Goodbye Mr Jones, you’ll be with me always.

Celebrating a Digital Birthday

On March 18, 2009 I created my Twitter account. I was skeptical in the beginning, hence my late adoption, but thankfully I persevered and it has since become an invaluable tool in my life, especially at work. Last week was also Twitter’s ninth birthday. There have been many key events that have catapulted Twitter onto the world stage: SXSW in 2007, and Janis Krums’ photo of passengers huddled on the wing of a US Airways plane only moments after the aircraft plunged into the Hudson River. But, as @Jack pointed out in his birthday message, it has been journalists who have been behind much of its success. Twitter has revolutionized global news delivery and consumption. Its ability to share events in real time is unprecedented in human history and has caused governments, corporations and individuals to change their strategies in one way or another. Twitter is testament to our ability to effect change in a world where we seem powerless in the face of speed and complexity. Long live Twitter.

However, is Twitter going to be a victim of its own success? Some of the policies that have been in place since its inception may well need to be revisited, the follower/following ratio for example: is it too restrictive after nine successful years and with the number of people now using the platform? Some steps have been taken to protect people from being targeted by cyber bullies, which is very welcome. The decision to limit Meerkat’s access to the platform? That debate is still going on in the blogosphere.

One of Twitter’s big business problems seems to be how to attract greater numbers of people to join and, once joined, how to keep them on Twitter. There are no easy solutions to this problem; newcomers are faced with what seems at first to be a complex and endless stream of data which is difficult to navigate. If newcomers could be gently introduced to Twitter using data analytics from a person’s online footprint (their interests, geographic and demographic data, for example), a newcomer follower/following ratio and some monitoring, maybe newcomers would have a better experience when joining and Twitter would get more of them to persevere and stay. A simple questionnaire, no more than five questions, when signing up to Twitter and creating a profile could be the starting point. Once a profile has been monitored for a while, suggestions on who to follow and tweets of interest could be recommended individually. Customer experience is vital in our instant-gratification world, and newcomers to Twitter are having a difficult and complex experience.

There are also those people and businesses that have been on Twitter for a while but are getting frustrated for one reason or another, some of whom leave. Again, instant gratification rules. One way of attempting to beat the system, as it were, is to buy followers (a definite no-no) or to sign up to one of the many services that unfollow people automatically (not a path I’ve taken). However, trial and error, perseverance and a healthy dose of curiosity work wonders IMHO. Attempting to take shortcuts defeats the objective; it’s much smarter to find your own way of using Twitter. Larry King phones a voice-mail number and his assistant transcribes it into a tweet. Smart.

Twitter is also forming partnerships. IBM is exploiting large troves of data for its business customers via the Twitter firehose, with 6,000 tweets a second, more than half a billion each day. Twitter data is being mined for how much your tweets reveal about you and much, much more.

The future is bright for Twitter, I’ll be continuing my social media journey there, I hope you will too.

@nigelbarron

Rekindling our love affair with print.

I’ve just re-read Wolf Hall by Hilary Mantel; it really is a superb book. Over the holidays I read the first in C. J. Sansom’s series of Shardlake books, Dissolution, prompting the Wolf Hall re-read. Around July 2014 we banned internet access in the bedroom: no phones, no iPads, nothing, only print. The change has been like discovering a long-lost friend and realising why you became friends in the first place, something chemical. My book reading had been on a steady decline for the past four years or so; so many distractions, everything vying for my attention. Mark Zuckerberg has started to recommend books via his stable of social media companies Facebook, WhatsApp, Messenger and Instagram. Not surprisingly the books he chooses race up Amazon’s best-seller list; is this the start of a renaissance for books? The internet, and in particular social media, has helped us put down our novels, biographies, history and text books and pick up our smartphones and tablets with worrying regularity. Foyles and Waterstones in the UK reaped the rewards of the print resurgence over the holidays as online growth slowed; one book retailer remarked that Kindle sales had ‘fallen off a cliff’.

The pace of technology innovation might be switching people off. Amazon had a bad year in 2014 as far as profits and the stock price are concerned, and 2015 may be difficult as well, especially given our rekindled love affair with print.

Reinvent

It’s time to reinvent myself again. A good friend of mine, technically quite brilliant, told me recently that even she is finding it difficult to keep up with the pace of technology. She comes from a Windows background, and some of the people her company is hiring haven’t seen a DOS prompt before. She’s decided to go for a management position within the next two years, a product manager role maybe. It’s not as if I haven’t had to reinvent myself in the past; it’s been an unconventional path I’ve had to take to get here, the road less travelled, so to speak.

School wasn’t so much about education but rather a course in survival. Luckily I was good at sports so I avoided the worst of the bullying, just the odd fight now and again. Leaving school without a high school diploma at the beginning of Milton Friedman’s economic experiment via Ronald Reagan and Margaret Thatcher would prove to be challenging. After working in bars, driving vans and at one point working in Antibes in the south of France varnishing yachts, my Dad helped me get a job at the company where he worked, manufacturing hydraulic excavators, cement mixers and dumper trucks. I started in the paint shop, preparing all these machines ready for their paint job. The production manager saw something in me and gave me a job on the production line in the goods inward department. I got my fork lift licence and within a few years I became the foreman.

Unwittingly, the job prepared me for my next reinvention. By the end of my time in the UK’s dying manufacturing sector I had developed some transferable skills. The production line I worked on turned out about 20 machines a week. Each machine had thousands of parts, hydraulic rams, engines, steel tubes, nuts, bolts and washers, all of which had a seven-figure part number and a designated storage place for easy access. The men on the line could show me a part and I’d be able to tell them the part number and where it was stored. I also knew at what point in the production cycle each part was needed and who needed it. With over a thousand men working on the line, I knew them all by their first names; communication was key.

By the time trickle-down economics was beginning one of its many recessions and the Japanese had begun to master the Toyota way and just-in-time production methods, someone saw something in me again and said ‘you should go to university’. I’d always enjoyed reading and been curious, but the last time I’d studied anything was back in high school. By then it was obvious there was no future in manufacturing in the UK and I knew I had to get an education. I enrolled on a two-year Higher National Diploma course in business studies, majoring in marketing. I was a house owner at the time, so I chose a college close to home so I could keep an eye on the place and make sure the tenants were looking after it. The course had a law module that I really enjoyed and I came top of my class. When I graduated the country was coming to the end of the recession, but I didn’t feel as though I had learned enough, so I enrolled on the LLB Law degree course for another three years.

Those three years would be some of the toughest of my life. Without the skills I’d developed in my previous incarnation I don’t think I’d have got through. I had to learn a new set of skills too: listening, writing, critical thinking, researching, analysis and a lot of perseverance. However, it was my introduction to technology and desktop computers that prepared me for my next reinvention after I graduated. Law school was not an option from a financial perspective and my interest in technology had emerged. I was, to all intents and purposes, broke. I sold my house to pay off student loans and found a job with a national logistics company, managing their distribution systems on a client site. This experience would influence my decision to move south and attend a boot camp for a month, learning DOS and how to dismantle and rebuild desktop computers.

Shortly afterwards I started working for what would become, a few years later, Fox IT. The company was formed by merging a helpdesk/desktop support company and an IT Service Management company, Ultracomp. Ultracomp was steeped in ITIL history and could boast authors of the official ITIL publications, including the ITIL v2 Service Support and Service Delivery volumes and some of the chapters of the service management lifecycle in ITIL v3. It quickly became apparent that the ITIL maxim ‘IT is the business and the business is IT’ was well ahead of its time, and I began the route to ITIL certification that culminated in 2010 when I became ITIL Expert certified.

Fox IT was not immune to the effects of the dot-com crash and in 2002, as at so many tech companies, redundancies were in the air. I volunteered. It was time for more reinvention and an adventure: I bought a plane ticket to Quito, Ecuador and began a process that would not only change my life but also my whole world view, priorities and relationships. In return for Spanish lessons I taught English, and I volunteered with Médecins du Monde, helping the indigenous Ecuadorians construct a dairy and introduce tourist treks to their village. The whole one-year experience was mind-blowing, and I decided that rather than return to the UK I’d take the Spanish I had picked up and see if I could find some work in Madrid, as I was rapidly running out of money.

Two months later I started working for CSC in Asturias, a province on the north coast of the Spanish peninsula, and I’ve been here ever since. I’ve been working remotely since 2005 with colleagues from all over the world; I’ve travelled to India and Central America, and I visit the UK to see family and friends, who also come and visit me. It’s been a roller coaster ride, and many other things have happened of course, some good, some bad: my Dad died of prostate cancer; I turned 50.

There have been a few constants on this journey, the importance of communication for example. I’ve adapted and welcomed change rather than feared it. What’s next?