Access control and security management are and will continue to be driven by data.
The core function of access control involves the movement of large amounts of data. The company that moves data best will enjoy the most successful security management function.
The future of access control is the efficient movement of data: from the credential to the reading device, reading device to door controller, door controller to main personal computer (PC), main PC to external database and database back to the workstation.
Since software and its application are at the heart of security, the logical emphasis is on open architecture to facilitate moving data.
Why must data move to different locations in an access control system? Start out at the far end of the system, the end used by the person trying to enter the facility. Credentials today range from a card swiped or presented to sophisticated biometrics, where the identifying factors reside in a fingerprint, iris or other personal identifier.
Whatever form the credential takes, the effectiveness of the system depends on the transport mechanism and the reader technology. Moving data best means efficiently and effectively transporting information from the reader to the next piece of equipment in the chain, typically a door controller of some kind.
The information to be transported is usually fairly simple.
With a cardholder, it’s often nothing more than a number. But where biometrics are in place, the process may involve local manipulation of far more data contained in the biometric template. So the system’s ability to handle that data is critical at the reader level.
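As a rough illustration of how little data a card reader typically has to pass along, the sketch below decodes the widely used 26-bit Wiegand frame into a facility code and card number. The function name and error handling are illustrative, not any particular reader's firmware.

```python
# Illustrative sketch: decoding a standard 26-bit Wiegand frame, the kind of
# "nothing more than a number" a card reader typically passes to the controller.
# Bit layout: [even parity][8-bit facility code][16-bit card number][odd parity]

def decode_wiegand26(bits: str) -> tuple[int, int]:
    """Return (facility_code, card_number) from a 26-character bit string."""
    if len(bits) != 26 or set(bits) - {"0", "1"}:
        raise ValueError("expected a 26-bit frame of 0s and 1s")

    facility_code = int(bits[1:9], 2)   # bits 2-9
    card_number = int(bits[9:25], 2)    # bits 10-25

    # Leading bit gives the first 13 bits even parity;
    # trailing bit gives the last 13 bits odd parity.
    if bits[:13].count("1") % 2 != 0 or bits[13:].count("1") % 2 != 1:
        raise ValueError("parity check failed")

    return facility_code, card_number
```

The point is simply how small the message is: a few dozen bits cross the wire, not a record pulled from a central database.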
The data is transported to the controller, and in a well-designed system much of the decision-making process occurs at that controller, through distributed processing. Clearly, a system that requires moving data in its entirety to a PC, where a hard drive can be searched for a match before the door can be opened, is too unwieldy, inefficient and problematical in its results.
Controller Knowledge
Much of the responsibility for the open/no-open decision often rests with the controller. How does the controller know how to make the decision? It gets that information from the computer periodically in a download.
When people are enrolled in the system, data about them downloads through the transport mechanism to the controller where it’s stored. Then, when the credential is presented, just the data related to the credential moves over the transport mechanism to the controller, which proceeds to make the lock/unlock decision.
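A minimal sketch of that division of labor follows. The record layout, class and field names here are invented for illustration; real panels differ, but the pattern is the same: download once, decide locally.

```python
# Hypothetical sketch of distributed processing at the door controller: the host
# downloads credential records periodically, and each access decision is then
# made locally from that cached table, with no round trip to the PC.

from dataclasses import dataclass


@dataclass
class CredentialRecord:
    card_number: int
    access_level: str        # e.g. "all-doors", "lobby-only"
    active: bool


class DoorController:
    def __init__(self, doors_by_level: dict[str, set[str]]):
        self.doors_by_level = doors_by_level   # which doors each level may open
        self.credentials: dict[int, CredentialRecord] = {}

    def download(self, records: list[CredentialRecord]) -> None:
        """Periodic download from the host database into local storage."""
        self.credentials = {r.card_number: r for r in records}

    def decide(self, card_number: int, door: str) -> bool:
        """Lock/unlock decision made entirely at the controller."""
        record = self.credentials.get(card_number)
        if record is None or not record.active:
            return False
        return door in self.doors_by_level.get(record.access_level, set())
```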
The transport of data from the credential reader to the controller is essential. The controller doesn’t know anything unless the data has been moved to it from the central database.
Efficient Movement
When facilities have a lot of records, the initial download from the database to the controllers can take some time. More and more, the computer/controller connection is made over a company’s local or wide area network (LAN/WAN).
But with multiple controllers spread around a facility, inefficient data packaging and transport in the mass download from database to controller use excessive bandwidth and can bring the network to its knees. Here, efficient transport of data from the computer down to the controllers becomes the critical factor.
Industry-standard protocols such as TCP/IP are used as the transport mechanism, wrapping the data for movement over the network. But the data inside the wrapper is proprietary from one system to the next. The better designers do a better job of efficiently packaging data inside the TCP/IP wrapper.
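The sketch below shows the idea in miniature, assuming a made-up eight-byte record format: the operating system's TCP/IP stack supplies the standard wrapper, while the compact binary payload inside it is whatever the vendor defines.

```python
# Illustrative only: TCP/IP supplies the standard "wrapper"; what goes inside it
# is up to the vendor. A compact, fixed-width binary record (packed with struct)
# moves far less data per cardholder than a verbose text format would.

import socket
import struct

# Hypothetical vendor record: message type, card number, access level, active flag.
RECORD_FORMAT = "!BIHB"   # network byte order: 1 + 4 + 2 + 1 = 8 bytes per record


def send_records(host: str, port: int, records: list[tuple[int, int, int, int]]) -> None:
    payload = b"".join(struct.pack(RECORD_FORMAT, *r) for r in records)
    with socket.create_connection((host, port)) as conn:
        # The network stack handles the TCP/IP framing; the access control
        # system is only responsible for what the payload looks like.
        conn.sendall(payload)
```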
Large facilities often employ another area of data movement: interfacing with other systems. A large college doesn’t ask its students to register both at the registrar’s office and with security. The idea is for everyone to register in one place, with the data then moving to wherever it’s needed, including the access control system.
So once again the data has to move efficiently. Failure to accomplish this can mean using too much bandwidth, overloading the system and potentially crashing the network. This is another place where transport of data is critical.
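One way such an interface can stay efficient, sketched below with invented field names, is to move only the registrar records that have changed since the last synchronization rather than re-sending the entire database.

```python
# A hedged sketch of delta synchronization between a registrar system and the
# access control system: only rows modified since the last sync are pushed.

from datetime import datetime


def records_to_sync(registrar_rows: list[dict], last_sync: datetime) -> list[dict]:
    """Select only enrollment rows modified since the previous synchronization."""
    return [row for row in registrar_rows if row["modified_at"] > last_sync]


def push_to_access_control(rows: list[dict]) -> None:
    for row in rows:
        # In a real interface this would call the access control system's
        # import API or write to its queue; here it is just a placeholder.
        print(f"enroll/update cardholder {row['person_id']}")
```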
Most companies employing large access control systems wish to use the systems at multiple locations through workstations. Since the complete database can’t be duplicated at all those locations, there must be an efficient way to pull data from the database to the workstation, moving it across the network.
Beyond Access Control
A less sophisticated workstation design brings the bulk of the material down to the workstation, where a local search can be made. A more sophisticated design is selective, using a browser-based thin-client workstation rather than the traditional thick-client workstation to retrieve only the records that are needed.
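The contrast can be sketched in a few lines, with hypothetical helper functions standing in for the real database and server calls: the thick-client lookup drags the whole table across the network, while the thin-client lookup sends a request and gets back only the matching record.

```python
# Thick client vs. thin client, in miniature. fetch_all_cardholders and
# query_server are stand-ins supplied by the caller, not real APIs.

def thick_client_lookup(fetch_all_cardholders, card_number: int):
    everything = fetch_all_cardholders()          # entire table crosses the network
    return next((c for c in everything if c["card_number"] == card_number), None)


def thin_client_lookup(query_server, card_number: int):
    # Only the request and the single matching record cross the network,
    # e.g. the kind of call a browser-based workstation might make.
    return query_server(f"/cardholders?card_number={card_number}")
```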
Good access control and security must easily interface with facility management, human resources and all the other systems integral to the large organization or enterprise environment. In fact, ease of integration with other systems is the benchmark of the data-driven system.
The optimum architecture for such a security management system will be object-oriented, component-modular and fully scalable. It can begin with stand-alone applications and work upward in building-block fashion, through server-based applications to full-enterprise applications, with virtually limitless expandability.