EyeSpyFX Commercial Director

JMcK

We are delighted to welcome John McKenna to EyeSpyFX

John will take the role of Commercial Director with a particular focus on developing sales of our BT Smart Locks and IPIO range of products and services.

John has extensive commercial experience in the power generation and construction sectors and we are excited that he is joining us.


CCTV Tower Units and IPIO

Tower300
CCTV Tower Units are an ideal solution for temporary locations such as building sites. IPIO helps to resolve problems commonly found when Tower Units are professionally monitored by an Alarm Receiving Centre (ARC), and also improves the end user experience.

Arm/Disarm Video Monitoring
The IPIO system is ideal for arming and disarming AI video monitoring systems. Most popular CCTV systems are compatible with IPIO, including Hikvision, Dahua, Davantis, Avigilon and Camect.
armedCCTV

Better than a schedule
Sites that have Tower Units on them are often temporary locations. A regular schedule can never be established – there is no pattern. IPIO allows users to arm and disarm as they wish using a mobile app. The app gives site users total control, flexibility and freedom while preserving security and privacy – so much better than a schedule! When site users have the freedom to arm and disarm as needed, false positive identifications of friendly people are reduced. This in turn reduces the number of event alert phone calls from the ARC to the site users.

Easy management of users
Tower Units often monitor building sites. They are inherently dynamic places. As the project progresses the persons who arm/disarm the site change. The IPIO app easily manages multiple and changing users. New users can be added and deleted from the site with ease.
usermgt

Logs
Every event that occurs on the IPIO unit is logged, including arm/disarm events and who performed them. The logs appear in the Site Manager's IPIO app. They are thoughtfully designed to maximise legibility and ease of use.
IPIOLogs

In App Notifications
App notifications alert Managers to selected events. Managers can easily choose not to receive notifications if, for example, they are not working at the weekend.
notifications473

Audio alerts
In the event of intruders, monitoring centre staff can use IPIO to play an audio warning or switch on lights.
audiolightscctvhatch

Introducing IPIO CC

IPIO is a physical device that, to date, has armed/disarmed CCTV over wired I/O connections. IPIO is often used to control and monitor doors, gates, alarm systems or any other device that is part of the security system.

EyeSpyFX is pleased to introduce IPIO CC

IPIO CC (cloud control) is a virtualised IPIO unit. It is hosted in the cloud; there is no physical I/O component. CC can be used to arm/disarm Camect CCTV analytics hubs and monitor SIGFOX devices. IPIO CC has the advantage of not needing any I/O wiring, as all connections to controlled and monitored devices are managed via software integrations.

IPIO CC units behave the same way as regular IPIO units on the IPIO mobile app and the IPIO web portal.

IPIO CC is an ideal way to associate a SIGFOX device with the IPIO app, giving the user monitoring with no power supply or SIM card required, plus a powerful mobile reporting app. IPIO CC is ideal for gates to wind farms, harbour and maritime settings, building site equipment monitoring, warehousing, tower unit enclosures and many other applications.

Enquiries: info@eyespyfx.com

SIGFOX and Camect devices monitored and controlled by IPIO app


Remote gate sensors

EyeSpyFX are proud to announce an integration between IOTA door/gate sensors and the IPIO mobile app. IOTA door/gate sensors and IPIO are ideal for:

  • gates on remote building sites
  • farm buildings
  • the access hatch to CCTV tower units
  • equipment rooms in harbours, boat yards and other remote locations
  • boat hatches

The IOTA door/gate sensor is a robust device suitable for outdoor deployment in remote locations. It uses the Sigfox network, a zero-G mobile network that transmits very small pieces of data. The sensor has its own power supply and network connectivity. Open/close events are logged on the IPIO mobile app and a notification is sent to the user. Battery status, signal strength and position state are displayed in the app input tile.

  • Robust build quality
  • Excellent UK and Ireland Sigfox network coverage
  • 10 year battery life
  • Event logs shown in IPIO app
  • Event notifications sent to app
  • Battery and Signal strength indicators in app
  • Event rules can trigger IPIO actions such as lights on or warning siren
  • No more running wires to difficult-to-reach doors and gates
boathouse2

IOTA remote sensor and IPIO: ideal for locations with no power or internet

Gate Sensor

Robust Gate Sensor

ipioapp

IPIO app showing Sensor tile and door and CCTV controls

IPIO app showing log page


Sigfox coverage

Sigfox coverage in UK and Ireland

Contact info@eyespyfx.com for further information.


New IPIO rules feature

IPIOrules
IPIO has a new programmable rules feature (IPIO app v3.03 onwards).

Using the rules feature the app can be programmed to carry out an action based on an input event. For example: If the gate opens (input 1) then disarm alarm (output 1) and switch on lights (output 2).

The rules feature is ideal for building up sophisticated I/O behaviours on monitored sites.

The IPIO rules feature is not visible to all users. It is a powerful feature and only appears to users who have Portal Access. This enables site managers working in the ARC to create rules while not burdening end users with user interface elements that they will not use.

IPIO rules are based on “if this then that” type logic. If input 1 is triggered, then output 2 will arm.

There is a rules simulator feature that visually models the rule. Outputs can be set to match, flip or oppose an input state.

There is a second section to rules (not shown above) that allows multiple outputs to follow each other. For example, if Output 2 is armed, then arm Output 3.
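The rule behaviour described above can be sketched in a few lines of Python. `Rule` and `apply_rules` are invented names for this illustration, not the actual IPIO software:

```python
# Illustrative sketch of "if input then output" rule logic.
from dataclasses import dataclass

@dataclass
class Rule:
    input_id: int        # the input that triggers the rule
    output_id: int       # the output the rule drives
    mode: str = "match"  # "match": output follows the input; "flip": output opposes it

def apply_rules(inputs, outputs, rules):
    """Return new output states after applying each rule to the input states."""
    outputs = dict(outputs)
    for rule in rules:
        state = inputs[rule.input_id]
        outputs[rule.output_id] = state if rule.mode == "match" else not state
    return outputs

# Example from the text: if the gate opens (input 1), then disarm the alarm
# (output 1, flipped off) and switch on the lights (output 2, matched on).
rules = [Rule(1, 1, "flip"), Rule(1, 2, "match")]
new_outputs = apply_rules({1: True}, {1: True, 2: False}, rules)
# new_outputs == {1: False, 2: True}
```

The output-follows-output rules in the second section could be modelled the same way, treating one output's state as the trigger for another.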

We would be pleased to hear about your applications based on rules, and we look forward to making the rules section even more powerful and responsive and intelligent in due course.

Navihedrons and Roy Stringer

I met Roy Stringer at an academic conference at Glasgow School of Art in about 1998. He was there talking about his Navihedron concept (it was a coincidental meeting; I was at the conference speaking about a different subject). Something about Roy Stringer and his Navihedron idea has made me remember it all this time.

Stringer’s idea was that an icosahedron (12 vertices, 30 edges, 20 faces) could be effectively used as a navigation system to present non-linear content. Each point of the icosahedron would be a subject heading. Each point can be clicked. Two things happen when a point is clicked: 1. the Navihedron animates, bringing the clicked point to the frontal position; 2. content relevant to the clicked point’s subject heading appears in a frame or content area located adjacent to the Navihedron.

It is a very pleasing effect. The animation is engaging. The viewer can revolve the Navihedron exploring the 12 points. Each point has five geometrically related points that offer the next step in the story. The viewer can create their own path through the story. It allows users to browse in a natural way and yet remain within a cohesive story. In comparison linear presentations seem rather boring. The key to it may be that the reader can select their own entry point rather than the start of a linear story. They can click on the point they are interested in. That point is the start to their story. Once they have selected their start they can control their own story rather than being directed through numbered linear pages.
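The geometry behind this is easy to check in code. This short Python sketch (independent of any Navihedron implementation) builds the icosahedron's 12 vertices and verifies the 30 edges and the five related points each vertex offers:

```python
# Build the icosahedron behind the Navihedron and check its structure:
# 12 vertices, 30 edges, five neighbours per vertex, and an opposite
# vertex for every point.
from itertools import combinations

PHI = (1 + 5 ** 0.5) / 2  # golden ratio

# The 12 vertices are the cyclic permutations of (0, +/-1, +/-PHI).
vertices = []
for a in (1, -1):
    for b in (PHI, -PHI):
        vertices += [(0, a, b), (a, b, 0), (b, 0, a)]

def dist2(p, q):
    """Squared distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(p, q))

# Edges join the closest pairs of vertices (edge length 2, squared distance 4).
edges = [(i, j) for i, j in combinations(range(12), 2)
         if abs(dist2(vertices[i], vertices[j]) - 4) < 1e-9]

neighbours = {i: {j for a, b in edges for i2, j in ((a, b), (b, a)) if i2 == i}
              for i in range(12)}
```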

When Stringer presented the Navihedron in 1998, a user could sign on to Navihedra.com, build their own Navihedron and edit the headings. The idea was that a finished Navihedron could be exported and then used in a stand-alone website.

Stringer demonstrated with a physical model of the Navihedron made from plastic tubes held together with elastic. Any subject could be explored using the model as a tool and prompt. I have made a few of these over the years and they are a great three-dimensional connection between information and space.
navi2
Roy Stringer showing opposite points. Image credit: https://katiesvlog.blogs.com/vlog/2006/03/ted_nelson_at_t.html

Stringer felt that the Navihedron might be useful for enabling students to put together a story about a subject they were studying. As an example of this I made a sketch showing a Navihedron in the context of a school history class about the Vietnam War. It is not a linear presentation of history – you can enter from any point of interest.
vietnamnavihedron
Author’s sketch of a non-linear history lesson style Navihedron

The Navihedron idea has had a wide reach. For example, Andy Patrick, formerly of Adjacency, recalls Stringer’s influence on the design of early dot-com ecommerce sites, including the Apple store.

Stringer was inspired by Ted Nelson, who in 1965 first used the term “hypertext” to describe:
“nonsequential text, in which a reader was not constrained to read in any particular order, but could follow links and delve into the original document from a short quotation”.
Nelson says that HTML is what his project XANADU was trying to prevent: HTML has “ever-breaking links, links going outward only, quotes you can’t follow to their origins, no version management, no rights management”.

Here is the brilliant, thoughtful Ted Nelson talking about the structure of computing, XANADU and transclusion.

(En passant: don’t miss these great Ted Nelson one liners)

Roy Stringer sadly died very young in 2000. Although his work had huge impact, Stringer’s Navihedron site has somehow disappeared from the web; it has fallen victim to what Nelson calls a world of “ever-breaking links”. In fact I could not find any working Navihedron.

navihedra-com
Image Credit: The front page for access to Navihedra.com from Alan Levine‘s memories of Roy Stringer: https://cogdogblog.com/

Perhaps there is something in Stringer’s Navihedron work that has been superseded by the all-pervading paradigm of web-based navigation. The menu-at-the-top, sub-menu-down-the-side-or-dropping-down style of website (this one included) has become dominant.

Daniel Brown of Amaze (the company Stringer founded) helped to develop the Navihedra site, and he points out that Stringer’s original Navihedron was built using Shockwave. That technology has now been largely replaced by CSS and JavaScript.

At EyeSpyFX we set out to see if we could rebuild a Navihedron using modern web-friendly technologies. The Navihedron you see here is a cube and not an icosahedron as Stringer originally proposed – but the storytelling concept is essentially the same. We built it driven by curiosity really – just to see if we could do it. It is not quite performing as we intend, but I think it is a good start (controllable and animated if viewing on a PC (browser dependent), just animated if viewing on mobile). We would like to build more and in time develop it into a fully editable system for creating Navihedrons – we are searching for the commercial reason to push it forward. The Navihedron concept is intriguing – perhaps a lost web treasure. Fully implementing a Navihedron using web technologies is surprisingly difficult – it is going to be a long term project. This blog article and demo is just a step in the general direction.

App Life

Some iOS apps are now in their 10th year. In the EyeSpyFX app portfolio we haven’t got any that old – but we have some 7, 8 and 9 year olds. One of our apps, “My Webcam”, would have been over ten years old, but we retired it two years ago. In fact that app would be in its 16th year had it survived. Before its iOS manifestation, “My Webcam” had a life on BlackBerry, and before that as a Java app on Nokia and Sony Ericsson.

Apps get discontinued for lots of reasons, for example:

  • Removed by the app store due to lack of updates
  • Competition from other apps
  • The app store closes
  • The original reason for the app does not exist anymore
  • No sustainable commercial rationale
  • The app function is now an OS function included in standard software
  • App is so bloated it is difficult and expensive to update
  • App was built using a previous development environment and re-factoring it is not worth the cost
  • The app gets subdivided into smaller apps

Memory of “My Webcam” prompts me to reflect on the life cycle of apps in a general sense. I wonder, if you could go to the app store and do a filtered search by date of publication, what the average app life would be. In ten years’ time will there be apps that are 20 years old?

Our experience as developers is that as apps grow old they also grow bigger and then even bigger as more time passes and features get added.

There are some practical limits to app growth. Ultimately an app has to fit on a phone shaped screen and there is a limit to how many buttons you can fit in. If you keep adding functionality and features to an app year after year it inevitably becomes bloated. The bloated app – perhaps now resembling something more like “enterprise software” departs from the very concept of an app: “a small neatly packaged part of the internet”.

So why do apps grow? Top reasons include:

  • We don’t want to have lots of apps – we just want one app so our customers find it easy to choose
  • The PC interface does this – so the app should do it as well
  • The UI on the product does it – so the app should do it as well
  • The user base is growing and we need to give the users new stuff
  • Some of our customers are specialised/power users and we need to support them.

These are good corporate reasons, but they strain the app design and tend to forget about the plain old user who wants the app to do the one thing the app name/icon suggests.

Serving everybody also does a disservice to the specialised power user. They come to the app with their special requirement but find their special feature buried in an app whose whole design and evolution serves a more general use case.

Rethinking a mature app as separate apps enables the company to serve specific user requirements, for example: to open the door, to view the camera, to check the logs, to disarm the system, to view the recording from two weeks ago. It is of course tempting from a corporate point of view to keep all of these functions together in a super security app. However, each function has a specific user group in mind. A suite of mini apps could be created with the name of each app reflecting the function or user role.

Subdividing mature multifunctional apps into generation 2 apps can help with making an app easy to understand and use again. The really difficult question is, when is the right time to stop developing and start subdividing?

The point of subdivision can arrive simply for an internal practical reason – being too costly to maintain, for example. A more fruitful sort of subdivision can also occur as a result of a design review, led by users, to give the app a new lease of life.

applife

Single and Stationary vs Multiple and Mobile

In many security applications the PC client is seen as the primary interface to the system. Use of the PC client is a dedicated task, normally carried out by a single person checking for a specific item of interest.

When a mobile app is introduced – assuming that the app is easy and effective to use – the number of users tends to go up. The frequency and type of use also tends to increase.

If people don’t need to log on to a PC and can instead check a mobile app, they tend to check in more frequently. Also, more people check in. One person with the app tells the next to get it, says it is easy, and user numbers grow. The sort of use tends to diversify. People find different reasons to check in. For some, the reason is security, the same as it ever was; others may use the logs to check who is in or out at present (staff levels). Others may check for crowds on the shop floor, or to see if the delivery lorry has been dispatched yet (workflow). Yes, some might use the system to see if there is a queue in the staff canteen.

Once the security system is made accessible in the form of a mobile app people find the data contained within useful for lots of different reasons. It is therefore generally true (certainly for security apps and maybe for other domains also) that users tend to be single and stationary or multiple and mobile.

EyeSpyFX user data suggests that the ratio of stationary PC users to mobile app users is about 1:3. Of course this will vary from application to application and installation to installation.

Android wish list for 2017

Android 1.0. HTC Dream - 2008


When the first Android phones were launched it was unclear (to me at least) how the ideas of “search” and “mobile phone” would come together. (Crazy, I know!)

Fast forward to 2017: voice command and search integration with a security camera app might soon allow a user to say the commands:
“Go to camera 34,
go back an hour,
go forward 5 minutes,
go back 1 minute,
zoom in,
pan left,
jump to live,
switch to Front Gate camera”.

The voice commands would control an app which would chromecast to a big screen.

This vision is not exceptionally fanciful as many security camera apps can do all of the above today – except using a visual touch UI.
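A sketch of how such phrases might map onto app actions; the command grammar and action names below are invented for illustration, not taken from any shipping security camera app:

```python
# Hypothetical parser turning spoken phrases into (action, argument) pairs.
import re

UNITS = {"second": 1, "minute": 60, "hour": 3600}

def parse_command(text):
    """Turn one spoken phrase into an (action, argument) pair."""
    text = text.lower().strip().rstrip(".,")
    m = re.match(r"go (back|forward) (an?|\d+) (second|minute|hour)s?$", text)
    if m:
        direction, amount, unit = m.groups()
        n = 1 if amount in ("a", "an") else int(amount)
        seconds = n * UNITS[unit]
        return ("seek", -seconds if direction == "back" else seconds)
    m = re.match(r"go to camera (\d+)$", text)
    if m:
        return ("select_camera", int(m.group(1)))
    m = re.match(r"switch to (.+) camera$", text)
    if m:
        return ("select_camera", m.group(1))
    if text in ("zoom in", "zoom out", "pan left", "pan right", "jump to live"):
        return (text.replace(" ", "_"), None)
    return ("unknown", text)

parse_command("Go to camera 34")       # ("select_camera", 34)
parse_command("go back an hour")       # ("seek", -3600)
parse_command("go forward 5 minutes")  # ("seek", 300)
```

The hard part, of course, is not the mapping but getting the voice platform to hand such custom phrases to the app in the first place, which is the barrier discussed below.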

Voice commands and search are closely connected. A voice command is inherently vague. Search is a key computational mechanism used to interpret a voice command and find a best-fit reply.

There are just two barriers holding back the vision as outlined above: 1) in-app search and 2) custom voice commands.
1) In-app search is available only in a very limited sense at present. You can have Google index the app manifest, and app functions then show up when you do a relevant search. This, however, does nothing to help search the user-generated content within an app.
Google has tried searching data held on private computers before. In 2004 Google launched a PC application called Desktop, which indexed all the data on your PC. The project was closed in 2011 because Google “switched focus to cloud based content storage”.
2) Requests for custom voice actions from third party app developers are currently closed. (This is also the case for Siri, by the way.)

Custom voice commands - not yet (Dec 2016)


With neither in-app search nor custom voice actions available, it seems that the vision for fully integrated voice control of apps is not viable – for now.

If OK Google and Siri continue to grow in popularity, will the pressure for custom voice commands also be the catalyst for enabling in-app search?

Voice actions and in-app search could be (more easily?) achieved if you move the location of apps from the phone to a Google/Apple account in the cloud. An added advantage of apps in the cloud is that we could log on from anywhere and use custom apps.

Choose Google or Apple


With thanks to Uber, maps, cheating in pub quizzes and countless other uses, it is now clear that search and phones are a perfect match. It seems (to me at least) that the next wave of development for search and phones will involve voice commands. Voice command based interfaces also seem to fit well with wearables and control of IoT devices.

To conclude, a seasonal wish list for 2017:

  • In-app search for user generated data
  • Custom voice commands made accessible to third party app developers
  • Move the concept of apps away from the phone and onto a Google account. No more downloading.

EyeSpyFX introduces a new library for reading H.264 video.

For network camera and VMS manufacturers who need to build a mobile solution, SFX100 is a code library that enables iOS and Android apps to be built that decode and display MJPEG and H.264 video using RTSP over TCP, RTSP over HTTP and RTSP over HTTPS.

Unlike bulky open source projects such as FFmpeg, Live555 and VLC, published under GPL or LGPL, SFX100 is a proprietary library available under licence that is ready for immediate and efficient deployment in commercial mobile projects.

SFX100 is optimised for Security Camera Video applications uniquely offering a secure layer for streaming RTSP tunneled over HTTPS.

SFX100 is exemplified in EyeSpyFX’s premier iOS mobile app “Doorcam” (https://itunes.apple.com/gb/app/doorcam/id1060661561?mt=8).

Key features include:

  • Secure layer for streaming RTSP tunneled over HTTPS.
  • Per project commercial licence
  • Optimised code for security camera video types
  • iOS and Android libraries available
  • Reads RTSP streams and provides mechanism to pass to phone based native decoders
  • Compatible with IPv6
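SFX100 itself is proprietary, but RTSP tunnelled over HTTP(S) is a well-known convention (originating with QuickTime streaming): a GET connection carries server-to-client RTSP replies, a paired POST carries client-to-server RTSP commands encoded as base64, and the two are linked by a shared x-sessioncookie header. A minimal sketch of that handshake – host and path are placeholders, and no network calls are made:

```python
# Build the paired GET/POST requests that open an RTSP-over-HTTP(S) tunnel.
import base64
import uuid

def tunnel_requests(host, path="/stream", cookie=None):
    """Build the GET (server-to-client) and POST (client-to-server) requests."""
    cookie = cookie or uuid.uuid4().hex
    get_req = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"x-sessioncookie: {cookie}\r\n"
        "Accept: application/x-rtsp-tunnelled\r\n\r\n"
    )
    post_req = (
        f"POST {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"x-sessioncookie: {cookie}\r\n"
        "Content-Type: application/x-rtsp-tunnelled\r\n\r\n"
    )
    return get_req, post_req

def encode_rtsp(command):
    """RTSP commands travel on the POST channel as base64 text."""
    return base64.b64encode(command.encode())

get_req, post_req = tunnel_requests("camera.example.com")
options = encode_rtsp("OPTIONS rtsp://camera.example.com/stream RTSP/1.0\r\nCSeq: 1\r\n\r\n")
```

Running the tunnel over TLS (the HTTPS case) wraps both connections in a secure layer without changing the handshake itself.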

Contact us on info@eyespyfx.com for further information about how SFX100 can be deployed in mobile apps.