Tuesday, August 11, 2015

5 Reasons Enterprises Have Difficulty Implementing New Technologies





By Jonathan Reichental, Ph.D.

Chief Information Officer
City of Palo Alto







Technology innovation abounds! We live in spectacular times. Change is happening rapidly and in unexpected ways. Market barriers for innovation have been lowered. Got an idea? You can make it happen.

But despite all the ebullience, much of our innovation still remains incremental. It’s more often evolutionary rather than revolutionary. In fact, that’s just the way it’s always been. New knowledge is created at its natural pace and new insights build upon it. Occasionally there is a ground shift and a new branch of knowledge emerges that itself spawns new products and services. In the information technology (IT) business, we see this every few years.

Sure, we should give credit where it’s due. The IT industry is more often at the leading edge of innovation when compared with other industries.

I write here not about the IT innovation that we see happening in businesses every day, nor about the important incremental innovation that helps businesses move forward. I’m referring to breakthrough innovation — the kind of innovation that reinvents everyday things. Of course it happens eventually, but it takes a long time.

The reality is that organizations are willing and able to adopt technology only at their own pace. It can be valuable to understand why this might be the case. No matter how much you fight it, the appetite for technology adoption at a given enterprise is a throttle on the velocity of new innovation.

If this were not the case, I imagine we would already have had teleportation and invisibility cloaks at our disposal.

Over my career working in and observing multiple enterprises, I’ve noted some consistent traits that provide rationale for their speed of technology adoption. It’s fair to say that there is a spread, but the majority in the bell curve move at a slow rate. Of course, there are always clear exceptions and we have to recognize those trailblazers too. However, even the first movers are constrained by the majority. The majority dictates the market.

Below I briefly discuss five reasons I believe enterprises continue to have a slow adoption rate for innovative technology. I’ll admit there are no surprises in this list. However, I think they are worth calling attention to, particularly since we are in an impressive period of IT innovation. I’ve also added some thoughts on how they could be addressed.
 

1. COST

Decision-makers have many choices when investing scarce dollars on IT projects. In most organizations it’s a prioritization process that nobody enjoys. But it’s essential. Many great ideas fall by the wayside and never see the light of day in favor of more pressing enterprise needs. In this context, broad implementation of new technology — not research and development efforts — can have real problems securing funds.

Additionally, a new solution is often more expensive because of the change that needs to happen. It’s a bigger proposition than an upgrade, an enhancement, or the roll-out of a commodity-type ERP. Other costs, such as risk and the implementation unknowns, can provide a disincentive to decision-makers already jaded by too many failed IT projects.

ADVICE: Despite these constraints, many enlightened organizations still commit funds to high-risk, new technology projects, often by using dollars set aside specifically for these special projects. Decide whether all projects should go through standard IT governance, or whether there should be an exception process triggered by technology that meets certain criteria, such as high risk and uncertainty.
 

2. COMPLEXITY

Today, fewer and fewer solutions remain islands within the IT infrastructure. There are often so many inter-dependencies that even a small change has downstream impacts that must be considered. Introducing new technology into these environments is seldom a trivial exercise. It’s also a reason why so many decision-makers prefer single-vendor stacks. Sure, standards have improved the situation immensely, but we’re still far from a place where customization isn’t required.

ADVICE: In many ways, this limitation is closely aligned with cost. Complexity becomes less of an issue if you’re prepared to invest in the effort. If possible, set aside funds in your budget exclusively for this exploratory work. Think about investing in a lab environment where ideas can be safely explored. Prototypes are a great way to win decision-makers over.
 

3. RESISTANCE

While both cost and complexity are largely quantitative metrics, there are a number of human factors that greatly influence IT decisions.

There is an unfortunate twist to our period of hyper-innovation. While we embrace and support it — we love new toys! — there’s a more sober component to new technology introduction that cannot be overlooked. It’s similar to that moment at a buffet when you know you’d like to try more, but you’re simply too full. Humans have a cap on the amount of new technology they are able to consume. Introduce too many new solutions and functions and they will be rejected.

This applies to system improvements too. Make too many far-reaching modifications and you risk a user rebellion. That’s a recipe for failure.

ADVICE: Leaders need to evaluate for their organization the pace at which new capabilities can be deployed. It’s probably a lot slower than we all think. Spend time to discuss different views with a variety of stakeholders. Analyze historical trends. Monitor usage as products get deployed. Over time it will become clear what the tipping point is. Your users will quickly let you know.

4. LEGACY

Our desire for change is often at odds with our need for things to remain the same. It has a lot to do with comfort and trust. We often like the things we know more than things that are new and unknown. There’s a reason we go back to that tried and tested Excel formula when we know we have the same capability in our latest ERP system. There’s a reason we continue to use email for seeking answers when our organizations have spent millions on elaborate knowledge management systems. There’s predictable value in legacy systems.

New technology often has to compete with these older solutions. You’ll need to pry many of your users away from old applications kicking and screaming. In some instances resistance will be so fierce that you’ll be forced to concede.

As a consequence, legacy systems can present a limitation to the introduction of new technology. It may not happen at the time of deployment. It’s just as likely to happen at IT governance when the decisions are being made on what projects to invest in. A debate may ensue that argues in favor of the legacy solution and that will kill the new technology before it ever sees the light of day.

ADVICE: Really focus on the business case. Communicate it loud and clear. Make sure you have air-tight evidence for the return on investment (but recognize the need for a small number of leading-edge projects to move forward without all the evidence). Numbers talk, particularly dollars. Championing should come from many different leaders. Make a strong case, but ultimately respect the organization’s choice.

5. POLITICS

Oh the joy of organizational politics! It should come as no surprise that politics plays an important role in IT decision-making. Sometimes it can be an asset. For example, escalating up a hierarchy to leverage leadership perspective can often be a good way of getting tough decisions made. But it can too often be a liability. For example, individual or team self-interest can result in vendor selections that don’t reflect evidence gained in requirements gathering.

Reconciling organizational and individual interests is a messy business. I imagine many of us can tell our own stories of how we observed decisions being made that had little basis in reasonable logic. We’d like to pretend it isn’t a factor, but all too often it is.

Negative organizational politics can hinder IT innovation. There is considerable value in the skill of those who can navigate within those constraints and turn them into a positive outcome.

ADVICE: Organizational politics shouldn’t be viewed as always being negative. It’s important to recognize the role it plays in the process of introducing new technology and then work to channel it into a positive force. Find out who your allies are and partner with them to help make a business case. Observe, listen and learn about your organization’s dynamics. Make note of what works and what doesn’t, and leverage that knowledge to navigate through the organization’s politics. It isn’t easy, but it’s an excellent skill for those who can master it.

Conclusion

These organizational constraints are presented not to suggest that new technology seldom gets introduced to the enterprise. Of course it does. The real effect of these limitations is that they slow the rate of introduction at a time when organizations face real consequences if they don’t innovate fast enough.

Recognizing and admitting that your organization has difficulty implementing new technology is the first step to fixing the problem. It could be one or more of these five reasons above or perhaps it’s something else. Most importantly, find out what it is and bake the fixes into your enterprise strategy. Your organization will be glad you did.


Footnote

Just in case you were wondering, invisibility cloak research and prototyping is well underway. Just use your favorite search engine to look up the subject. You might be as surprised as I was.

Monday, August 10, 2015

Redefining UC Mobility: There Are Many Ways to Be Mobile

A Frost & Sullivan White Paper

By Robert Arnold
Principal Analyst, Information and Communication Technologies
Frost & Sullivan

INTRODUCTION

Mobility is not just hype; it is the new way work gets done. Accelerated technology developments have enabled greater user flexibility through ubiquitous access to robust business communications capabilities that make employees more reachable, responsive, and productive.

Pervasive and familiar to everyone, smart phones and tablets dominate the business mobility conversation. These devices have become synonymous with mobility in the business communications market. Frost & Sullivan research forecasts the smart phone market to grow at a compound annual growth rate (CAGR) of 13.8% from 2014 through 2020. Despite this, we believe that limiting the business mobility discussion to smart phones and tablets is a narrow view.
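As an aside, the growth multiplier implied by that forecast falls out of the standard CAGR arithmetic, final = initial × (1 + rate)^years. The sketch below is illustrative only: the 13.8% rate comes from the forecast cited above, while the function name and base value are placeholders, not Frost & Sullivan data.

```python
# Illustrative sketch of the CAGR arithmetic. Only the 13.8% rate comes
# from the forecast above; the base value of 1.0 is a placeholder.
def project_cagr(initial, rate, years):
    """Project a value forward at a compound annual growth rate."""
    return initial * (1 + rate) ** years

# 13.8% compounded over the six years from 2014 through 2020 slightly
# more than doubles the starting value:
multiplier = project_cagr(1.0, 0.138, 2020 - 2014)
print(f"Growth multiplier: {multiplier:.2f}x")  # about 2.17x
```

In other words, a 13.8% CAGR through 2020 implies a smart phone market a bit more than twice its 2014 size, which helps explain why these devices dominate the conversation.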

There are many ways employees can be mobile. This article discusses key considerations in developing a comprehensive mobile business communications strategy that addresses the requirements of different users and outfits them with the right set of tools to more effectively complete their business tasks.


REDEFINING UC MOBILITY

Forward-thinking businesses are providing mobile workers with access to unified communications (UC) features, such as voice, instant messaging (IM) chat/presence, messaging, and conferencing, in order to improve employee productivity and business agility. However, the full benefits of mobile UC can only be attained when employees are matched to appropriate devices and applications that address individual user environments, workloads and preferences.

While smart phones and tablets provide numerous benefits, such as rich functionality and a convenient form factor, many professionals prefer the ergonomics and purpose-built performance of other endpoints and interfaces, such as desk phones, wireless handsets, and PCs/laptops running soft clients. Because of this, it isn’t just smart devices that are seeing healthy shipment volumes.

Specific to the business communications market, IP desk phones, voice over wireless local area network (VoWLAN) phones, digital enhanced cordless telephony (DECT) sets and desktop/PC soft clients are all expected to experience stable unit shipment volumes through the year 2021, per Frost & Sullivan research. The relative health of this range of options speaks to the UC endpoint diversity that today’s mobile workforce requires. For example, research and consulting firm Global Workforce Analytics estimates that one in five workers in the United States has an arrangement to work from home at least some of the time.

It would be unreasonable to equip most home-based workers, even occasional ones, with only a smart phone and expect them to adequately do their job.

ADDRESSING DIVERSE MOBILE WORK STYLES

Overall, effective UC implementations must support all mobile user needs, whether employees move throughout office buildings, change desks or offices daily, run from meeting to meeting, travel often, or work from a client location or from home.

Evaluating the needs of different mobile worker types is simplified when they are categorized. At a high level, there are essentially three mobile UC user categories: employees who are internally mobile, occasionally mobile, or frequently mobile. Furthermore, it is just as important to address the shared requirements of each category as it is to address the nuances that exist within each group.







THE IMPACT ON ADMINISTRATORS

Hyper-connected, always-on employee mentalities make UC mobility a necessity. However, the complexity of mobile UC management can strain IT and administrator resources. Many UC platforms and services provide a number of disparate management and provisioning tools for various mobility applications (i.e., mobile soft clients, teleworker, wireless devices, IM chat and presence, and conferencing).

In addition, many cloud UC services simply don’t offer the array of options that companies need to properly equip their diverse mobile workforce. As a result, management and administration processes are commonly tedious, time-consuming and error-prone. For users, this often means delays in access to tools, inability to personalize their experiences, and hesitancy to adopt. Such complications are most often an artifact of mobility being treated as an afterthought by the UC solution developer.

UC mobility management doesn’t need to be difficult, even for diverse scenarios. There are solutions that handle mobility requirements with the same priority as tethered devices and applications. Therefore, administrators seeking mobile UC solutions should demand simplicity without sacrificing control.

CONCLUSION

Smart phones and tablets are often top of mind when it comes to UC mobility, but they are not the best fit for all employee roles and tasks. There are a number of devices and interfaces that alone or in conjunction with smart devices provide better experiences to help employees remain connected, engaged and productive by optimizing their access to unified communications functionality. UC mobility best practices also extend beyond the users to the admin experience through delivery of a single, easy-to-use management tool set for all applications without sacrificing control.

An Interview With Param Bedi, Vice President, Library and Information Technology, Bucknell University




Param Bedi

Vice President
Library and Information Technology

Bucknell University
 






Frost & Sullivan recently spoke with Param Bedi, Vice President of Library and Information Technology at Bucknell University, about some of his key digital strategies, initiatives and challenges. Here are some of the insights gleaned from our brief but interesting discussion.

Frost & Sullivan: How did you get into the library and technology information space?

 
Param Bedi: Well, I received a Master's degree in Computer Information Systems and an M.B.A. in Finance from Temple University… but higher education is my true passion.

Frost & Sullivan: How do you deal with challenges around intellectual property and copyright enforcement?

Param Bedi: It all boils down to education -- Bucknell University does have a copyright officer, but he functions as more of a consultant (and occasional policeman). Overall, what we strive to do is educate the faculty and students on the policies in place, to make sure everyone understands what copyright is, for instance.

Overall, Bucknell takes a multi-pronged approach to intellectual property and copyright issues and works with the faculty as well.  In addition to guiding students on what they need permission to reprint or use for their research and course projects, librarians try to make sure students understand the concept of “fair use.”

Fair use is a legal doctrine stating that portions of copyrighted materials may be used without permission of the copyright owner provided the use is fair and reasonable, does not substantially impair the value of the materials, and does not curtail the profits reasonably expected by the owner. Given that many of the academic projects that students create are available to the world (video clips, blogs, etc.), it’s important for them to understand and respect copyright and the importance it has in their academic work.


Frost & Sullivan: What are your thoughts on Digital Rights Management (DRM)?

(DRM is a systematic approach to copyright protection for digital media. DRM’s purpose is to prevent the unauthorized dissemination of digital media and includes restrictions around copying content that has been purchased.)

Param Bedi: DRM in some ways restricts fair use for higher education purposes. Using digital content for learning purposes is very different, from say, copying and showing a movie. Libraries have been buying and sharing books for centuries and DRM changes that model. DRM technology needs to preserve the role of the library and fair use.  Overall, I believe the DRM model needs to be refined.

 
Frost & Sullivan: What key strategic issues are you focused on?
 

Param Bedi: I have three major initiatives:
  1. Business intelligence analytics -- We created a new team for our business intelligence initiative. The project has established a data warehouse to facilitate a new open access model that looks at student data holistically rather than as data silos of the registrar, admissions and financial aid offices, and institutional research.
  2. Open Education Resources -- Bucknell established a Presidential Task Force on Open Educational Resources and Residential Learning to investigate potential opportunities for achieving Bucknell's educational goals. We have since devoted significant resources in our Instructional Technology division to supporting faculty exploration of open educational resources.
    Bucknell faculty also passed an Open Access policy in October 2011 in which the faculty make all peer-reviewed journal articles they publish open access wherever possible.
  3. Digital Scholarship -- Since 2008, we have more than doubled staff on the Instructional Technology team. Their model has shifted from supporting faculty to partnering with faculty. We hired staff with deep expertise in geographic information systems, digital media and video editing, and digital scholarship, and faculty often ask these staff members to co-teach courses. We also created a Digital Scholarship Center in the library where faculty can experiment with new instructional technologies and meet with instructional technologists. 

Additionally, 2015 marks the second year in which Bucknell University has hosted a Digital Scholarship conference.  Last year, faculty and students from over fifty schools around the country attended to present their work and to learn about our digital scholarship efforts from our faculty, students, and instructional technologists.  This year’s conference will be held in the first week of November, http://dsconf.blogs.bucknell.edu.


Frost & Sullivan: Is there a “portable library” in the foreseeable future? What new library-related technologies interest you?

Param Bedi: Technology will continue to evolve. We are less focused on any specific technology than on using technology to implement important strategies. As far as a ‘portable library’ is concerned, our physical library is busier than ever! In fact, foot traffic at the library is up by 37% in the last decade.

Bucknell’s library is still the intellectual center of the campus. Students go there looking for collaboration, resources, and expert help. They are consulting with the librarians and instructional technologists and not just asking “quick” questions.  We are creating spaces for them to do just that.  We have quiet spaces, collaborative spaces, and computer labs in the library, along with the experts to work with the students on their academic projects.


Frost & Sullivan: Security concerns? Are they as pervasive as they are in the private sector? Also, your thoughts on “The Right to Be Forgotten,” a user-privacy cause against Google recently upheld in a European court?
 

Param Bedi: Security is a big concern; incidents seem to be on the rise, and providing and maintaining security uses up a lot of resources. In short, universities are big targets.

In terms of ‘The Right to Be Forgotten,’ I believe that Europe is often ahead of the U.S. when it comes to privacy laws, so this cause will be interesting to watch…

Prior to joining Bucknell University, Param served as Vice President for Technology and Library and Chief Information Officer at Arcadia University in Glenside, Pennsylvania. At Arcadia, he also taught courses in finance, management information systems, math, and educational technology and was an adjunct faculty member.

Friday, August 7, 2015

Is Meeting Regulations Really Enough When It Comes To Security?



 

By Ron Dinwiddie

Chief Information Officer
Texas Trust Credit Union






Being in a heavily regulated industry, we have an obligation to comply. It is understandable that the regulatory burden often leads some to consider doing the bare minimum to get through the next audit. When faced with an overwhelming number of requirements, we are tempted to calculate the minimum our team needs to do in order to be in compliance and avoid a finding.

As a former consultant and now CIO for my fourth financial institution, I have experienced policies, procedures, and practices that represented the bare minimum needed to satisfy requirements. The reasons for that were either “we don’t have the time to do it better” or “the auditors and examiners didn’t ding us, so it must be OK.”

Maybe this is acceptable in some areas, but how about when it comes to your data and network security?

I am in touch with other senior IT leaders, including CEOs, CIOs and CISOs, and clearly there is great interest in this industry in how to provide a higher level of security for our credit unions. Topics include multi-level security implemented at the brick-and-mortar level as well as the newest and most difficult area to control: mobile devices. Why are mobile devices so difficult? At the brick-and-mortar level we can control what security solutions, policies and procedures are implemented, but we have absolutely no control over what our members (customers to non-credit union industries) implement on their own device(s). We can make our mobile banking app secure, but if a member saves their login ID and password on their device and that device is compromised, so is their banking account.

In my discussions with other industry leaders, we all understand that just meeting regulations is not enough. Several of my peers have stated they are implementing the SANS Top 20 Critical Controls. So why are we all thinking this way?

  • There are multiple regulatory compliance bodies overseeing various industries and they don’t all provide the same guidance or requirement levels, suggesting one or more of these guides is missing something or the developers of the guidance have a different idea about what is most important.
  • These regulatory bodies are mostly reactive; once a vulnerability is identified they then develop the regulation, have it reviewed and approved, and publish it to their constituents. This takes time and leaves us vulnerable if we merely adhere to their publications.
  • Most regulations, though not all, are geared towards a specific industry, such as credit unions. But those of us in IT understand that bad guys use some of the same tactics from one industry to another to gain access.
  • It’s very difficult for regulatory bodies to draft a regulation that fits every environment. Not all credit unions have the same network structure, support staff, or ability to implement security solutions. A $40 million credit union doesn’t have the same resources as a $4 billion one, yet regulations must be designed and written to address organizations of all sizes.

As for why some IT shops don’t do security as well as they could, let’s look at the first excuse, that “we don’t have the time to do it better.” I would ask them, “Do you have the time to identify, counter, and remediate a network or data breach?” And how much time does it take for one of your IT staff to research their way through fixing a problem when your expert on that particular system or area within IT is unavailable, compared with having your Subject Matter Expert (SME) develop proper procedures that a backup can easily follow?

If there are loopholes in your policies because they meet only the bare minimum requirements, of course you will get compromised. Using lack of time as the reason for not doing things in the best manner possible is inexcusable. By blocking out dedicated time each week to work on these items, and having your direct reports do the same, you will make progress.

"As to the second excuse, that "the auditors and examiners didn’t ding us, so it must be OK," auditors and examiners have checklists to follow. And since some of them are auditing and examining multiple departments, their level of expertise is somewhat limited in one or more of those areas. IT audits and exams are, perhaps, the most difficult. Most auditors and examiners don’t come from an IT background; they get training and look for specific words or phrases in policies and procedures and certain types of software and hardware settings when they come onsite. Your customers – members in the credit union world – deserve more. They deserve the best security you can provide for their personal information.

Think about airports and how many people complain about the TSA security and how it slows everything down. But if those agents slacked off and let someone through who caused harm to people in one way or another, everyone would then scream about how TSA failed to catch them. Think about how many of your users, and in some cases members, complain about your security measures. What would those same people say if it was their personal information that was compromised because you lowered your security standards just to make them happy?

Here are some facts, as reported in Homeland Security/FBI communications I receive, concerning security threats and breaches:

  • BP reports it suffers 50,000 attempts of cyber-intrusion every day;
  • The Pentagon reports 10 million attempts every day;
  • The National Nuclear Security Administration records 10 million attacks every day;
  • Attackers average 205 days inside an environment before they are discovered;
  • 69% of victims learn from a third party that they have been compromised; and
  • Healthcare has become a much bigger target than financial institutions because healthcare records contain more personal information and the black market has become flooded with compromised debit/credit cards.

Most auditors and examiners you encounter will readily agree that meeting regulations may be the minimum you should do, but as a responsible senior IT manager you should constantly review and upgrade your security. There are many organizations you can join to help keep your security knowledge up-to-date. These include the FBI, the Department of Homeland Security, InfraGard and, for financial institutions, FS-ISAC.

Ron Dinwiddie’s 42-year career has spanned most areas of IT across a wide-ranging variety of industries. Ron started his IT career in the United States Navy working with mainframes as an operator, moving into programming, networking, system administration and security before retiring after 22 years on active duty.

After the Navy, Ron became a Unix instructor and consultant before moving into his first CIO role with a financial institution. Ron continued to expand his area of influence by moving to other financial institutions requiring his expertise in rebuilding their IT infrastructure and restructuring IT services to provide the highest quality of service to end users. Ron also developed and updated Information Security policies and procedures to meet regulatory compliance standards.