From Malware To Malevolent People

So in virus protection on the computer, there are two common ways antivirus software works:


1) Signature Detection – There are known patterns of viruses and the antivirus software looks for a match against one of these. 


2) Behavior Detection – There are known patterns of normal behavior on the computer, and the antivirus software looks for deviations from this. 


Each has certain weaknesses:


– With signature detection, if there is a zero-day exploit (i.e. a virus that is new and therefore has no known signature), it will not be caught by a blacklist of known viruses.


– With behavior detection, viruses that are designed to mimic normal network or application behavior will not be caught by heuristic/algorithm-based detection methods. 


For defense-in-depth, then, we can see why employing a combination of both methods works best to protect against malware. 
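
To make that combination concrete, here is a minimal sketch (illustrative only–the signature list, behavior baseline, and threshold are made-up assumptions, not real antivirus internals) of how signature matching and behavioral anomaly detection can work together:

```python
# Minimal illustrative sketch (not real antivirus internals): combining
# signature detection and behavior detection for defense-in-depth.
import hashlib

# Hypothetical blacklist of SHA-256 hashes of known-bad files (placeholder value).
KNOWN_BAD_HASHES = {"<sha256-of-a-known-bad-sample>"}

BASELINE_SYSCALLS_PER_SEC = 120.0  # assumed "normal" activity level
ANOMALY_FACTOR = 3.0               # flag anything 3x above the baseline


def signature_match(file_bytes: bytes) -> bool:
    """Signature detection: does the file's hash appear on the blacklist?"""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES


def behavior_anomaly(observed_syscalls_per_sec: float) -> bool:
    """Behavior detection: is activity far outside the normal baseline?"""
    return observed_syscalls_per_sec > BASELINE_SYSCALLS_PER_SEC * ANOMALY_FACTOR


def is_suspicious(file_bytes: bytes, observed_syscalls_per_sec: float) -> bool:
    """Defense-in-depth: flag if either method raises an alarm."""
    return signature_match(file_bytes) or behavior_anomaly(observed_syscalls_per_sec)


if __name__ == "__main__":
    sample = b"pretend this is an executable"
    print(is_suspicious(sample, observed_syscalls_per_sec=500.0))  # True: behavioral outlier
    print(is_suspicious(sample, observed_syscalls_per_sec=100.0))  # False: no match, normal behavior
```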


It’s interesting that these same techniques for recognizing bad computer actors can be used for identifying bad or dangerous people. 


We can look for known signatures/patterns of evil, abusive, and violent behaviors and identify those people according to their bad actions.


Similarly, we generally know what “normal” looks like (within a range of standard deviations, of course) and people who behave outside those bounds could be considered potentially dangerous to themselves or others. 


Yes, we can’t jump to conclusions with people — we don’t want to misjudge anyone or be overly harsh with them, but at the same time, we are human beings and we have a survival instinct. 


So whether we’re dealing with malware or malevolent individuals, looking at known patterns of bad actors and at significant deviations from the norm is helpful in protecting your data and your person. 😉


(Source Photo: Andy Blumenthal)

Cybersecurity Vulnerabilities Database


There is a very useful article in Bloomberg about how the U.S. is taking too long to publish cybersecurity vulnerabilities. 


And the longer we take to publish the vulnerabilities with the patch/fix, the more time the hackers have to exploit them!


Generally, the U.S. is lagging China in publishing the vulnerabilities by a whopping 20 days!


Additionally, China’s database has thousands of vulnerabilities identified that don’t appear in the U.S. version. 


Hence, hackers can find the vulnerabilities on the Chinese database and then have almost three weeks or more to target our unpatched systems before we can potentially catch up in not only publishing but also remediating them. 


Why the lag and disparity in reporting between their systems and ours?


China uses a “wider variety of sources and methods” for reporting, while the U.S. process focuses more on ensuring the reliability of reporting sources–hence, it’s a “trade-off between speed and accuracy.”


For reference: 


The Department of Commerce’s National Institute of Standards and Technology publishes the vulnerabilities in the National Vulnerability Database (NVD).


And the NVD is built off of a “catalog of Common Vulnerabilities and Exposures (CVEs) maintained by the nonprofit Mitre Corp.”
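
For anyone who wants to poke at the NVD themselves, here is a small example of looking up a single CVE; the endpoint and response fields reflect my understanding of the public NVD REST API (v2.0), so verify them against the current NIST documentation, and the CVE ID is just an illustration:

```python
# Illustrative lookup of one CVE in NIST's National Vulnerability Database (NVD).
# Assumptions: the endpoint URL and JSON fields follow the public NVD REST API
# (v2.0) as I understand it--check the current NIST docs before relying on this.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"


def lookup_cve(cve_id: str) -> dict:
    """Fetch a single CVE record from the NVD by its CVE identifier."""
    resp = requests.get(NVD_URL, params={"cveId": cve_id}, timeout=30)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    data = lookup_cve("CVE-2021-44228")  # example ID (Log4Shell), used only for illustration
    for item in data.get("vulnerabilities", []):
        cve = item.get("cve", {})
        print(cve.get("id"), "published:", cve.get("published"))
```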


Unfortunately, when it comes to cybersecurity, speed is critical.


If we don’t do vastly better, we can be cyber “dead right” before we even get the information that we were vulnerable and wrong in our cyber posture to begin with.  😉


(Source Photo: Andy Blumenthal)

Never Ever More Vulnerable


So we have never been more technologically advanced. And at the same time, we have never been more vulnerable.


As we all know, our cybersecurity has not kept anywhere near pace with our ever-growing reliance on everything technology.


There is virtually nothing we do nowadays that does not involve networks, chips, and bits and bytes. 


Energy

Transportation

Agriculture

Banking

Commerce

Health

Defense

Manufacturing

Telecommunications


If ANYTHING serious happens to cripple our technology base, we are toast!


From a crippling cyberattack that disables or hijacks our systems, steals or locks down our data, or creates massive chaotic misinformation flow, to an EMP blast that simply fries all our electronic circuitry–we are at the mercy of our technology underpinnings. 


Don’t think it cannot happen!


Whether it’s WannaCry ransomware or the Equifax breach of our privacy data or the Kaspersky Labs hidden backdoor to our top secret files or North Korea threatening to hit us with an EMP–these are just a few of the recent cyber events of 2017!


Technology is both a blessing and a curse–we have more capability, more speed, more convenience, and more cost-effectiveness than ever before, but there is also greater vulnerability to complete and utter death and destruction!


This is not just a risk that life could become more difficult or inconvenient–it is literally an existential threat, but who wants to think of it that way?


People, property, and our very society are at risk when our cybersecurity is not what it must be.


It’s a race between defensive and offensive capability. 


And we can’t just play defense, we had better actually win at this! 😉


(Source Photo: Andy Blumenthal)

Turn, Press, Pull — Gonna Get Ya


So as I go around town, I see more and more of these industrial-type control panels. 

The problem is that they are left stupidly out in the open and unprotected, or otherwise easily defeated.  

While probably not a serious threat of any sort, this one apparently is a unit to control some fans in an underground garage open to the public. 

You see the knobs you can just turn.

And one with a yellow warning sticker above it.

As if that will keep someone with bad intentions from messing with it. 

You also see the red and yellow lights…hey, let’s see if we can make those flash on, off, on.

Panel 13, nicely numbered for us–let’s look for 1 to 12 and maybe 14+.

It just continues to amaze me that in the age of 9/11 and all the terrorism (and crime) out there that many people still seem so lackadaisical when it comes to basic security. 

Many are in the habit of leaving doors and gates open, windows unlocked, grounds unmonitored, computers and smartphones without password protection, data unencrypted and not backed up, even borders relatively wide open, and so on. 

Of course, we love our freedom and conveniences.

We want to forget bad experiences.

Could we be too trusting at times?

Maybe we don’t even believe anymore that the threats out there are impactful or real.

But for our adversaries it could just be as simple as finding the right open “opportunity” and that’s our bad. 😉

(Source Photo: Andy Blumenthal)

I Like That Technology


Christopher Mims in the Wall Street Journal makes the case for letting employees go rogue with IT purchases.

It’s cheaper, it’s faster, “every employee is a technologist,” and those organizations “concerned about the security issues of shadow IT are missing the point; the bigger risk is not embracing it in the first place.”

How very bold or stupid?

Let everyone buy whatever they want when they want–behavior akin to little children running wild in a candy store.

So I guess that means…

Enterprise architecture planning…not important.
Sound IT governance…hogwash.
A good business case…na, money’s no object.
Enterprise solutions…what for?
Technical standards…a joke.
Interoperability…who cares?
Security…ah, it just happens!

Well, Mims just got rid of decades of IT best practices, because he puts all his faith in the cloud.

It’s not that there isn’t a special place for cloud computing, BYOD, and end-user innovation; it’s just that creating enterprise IT chaos and security cockiness will most assuredly backfire.

From my experience, a hybrid governance model works best–where the CIO provides the IT infrastructure, enterprise solutions, architecture, and governance, while the business units identify their specific requirements on the front line and ensure these are met in a timely and flexible way.

The CIO can ensure a balance between disciplined IT decision-making and agility on day-to-day needs.

Yes, the heavens will not fall down when the business units and IT work together collaboratively.

While it may be chic to do what you want when you want with IT, there will come a time when people like Mims will be crying for the CIO to come save them from their freewheeling, silly little indiscretions.

(Source Photo: Andy Blumenthal)

Remodulate The Shields For Cyber Security

I really like the concept for Cyber Security by Shape Security.

They have an appliance called a ShapeShifter that uses polymorphism to constantly change a website’s code in order to prevent scripted botnet attacks–even as the web pages themselves maintain their look and feel.

In essence they make the site a moving target, rather than a sitting duck.
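
As a toy illustration of that moving-target idea (this is my own sketch, not Shape Security’s actual product or code), imagine re-randomizing a login form’s field names on every page load so a scripted bot that hard-codes the field names breaks, while the server keeps a mapping to translate real users’ submissions back:

```python
# Toy illustration of the moving-target idea (my own sketch, not Shape Security's
# product): the form's field names are re-randomized each time the page is served,
# so bots that hard-code "username"/"password" break, while the server keeps a
# per-request mapping to translate real submissions back.
import secrets

FORM_TEMPLATE = """
<form method="post" action="/login">
  <input name="{user_field}" placeholder="username">
  <input name="{pass_field}" type="password" placeholder="password">
</form>
"""


def render_polymorphic_form() -> tuple[str, dict]:
    """Return the HTML plus a reverse map from random names back to the real ones."""
    mapping = {
        "user_field": "f_" + secrets.token_hex(8),
        "pass_field": "f_" + secrets.token_hex(8),
    }
    html = FORM_TEMPLATE.format(**mapping)
    # In a real system this reverse map would live server-side (e.g., in the session)
    # so submitted fields can be translated back to "username" and "password".
    reverse_map = {v: k for k, v in mapping.items()}
    return html, reverse_map


if __name__ == "__main__":
    html, reverse_map = render_polymorphic_form()
    print(html)
    print(reverse_map)  # different field names on every call
```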

This is like Star Trek’s modulating shield frequencies, which would prevent enemies from obtaining the frequency of the shield emitters so that they could modify their weapons to bypass the shield and get in a deadly attack.

In real life, as hackers readily change their malware, attack vectors, and social engineering tactics, we need to be agile and adapt faster than the enemy to thwart them.

Changing defense tactics has also been used by agencies like Homeland Security to alter screening methods and throw potential terrorists off from a routine that could be more easily overcome.

I think the future of IT Security really lies in the shapeshifter strategy, where the enemy can’t easily penetrate our defenses, because we’re moving so fast that they can’t even find our vulnerabilities and design an effective attack before we change it and up our game again.

And hence, the evil Borg will be vanquished… 😉

Learning IT Security By Consequences

This is a brilliant little video on IT Security.

What I like about it is that it doesn’t just tell you what not to do to stay safe, but rather it shows you the consequences of not doing the right things.

Whether you are letting someone into your office, allowing them to borrow your badge, leaving your computer unsecured, posting your passwords, and more–this short animated video shows you how these vulnerabilities will be exploited.

It is also effective how they show “Larry” doing these security no-no’s with signs everywhere saying don’t do this.

Finally, the video does a nice job summing up key points at the end to reinforce what you learned.

I think that while this is simpler than many of the longer and more detailed security videos I have seen, in a way it is more successful at delivering the message in a practical, down-to-earth way that anyone can quickly learn core basic practices from.

Moreover, this video could be expanded to teach additional useful IT security tips, such as password strengthening, social engineering, and much more.

I believe that even Larry, the unsuspecting office guy, can learn his lesson here. 😉

(Note: This is not an endorsement of any product or service.)

Cloud $ Confusion


It seems like never before has a technology platform brought so much confusion as the Cloud. No, I am not talking about the definition of cloud (which dogged many for quite some time), but about the cost savings related to cloud computing–or rather the elusiveness of them.

On one hand, we have the Federal Cloud Computing Strategy, which estimated that 25% of the Federal IT budget of $80 billion could move to the cloud, and NextGov (Sept 2012) reported that the Federal CIO told a Senate panel in May 2011 that with cloud, the government would save a minimum of $5 billion annually.

Next we have bombastic estimates of cost savings from the likes of the MeriTalk Cloud Computing Exchange, which estimates about $5.5 billion in savings annually so far (7% of the Federal IT budget) and projects that this could grow to $12 billion (or 15% of the IT budget) within 3 years, as quoted in an article in Forbes (April 2012)–or as much as $16.6 billion annually, as quoted in the NextGov article, more than triple the estimated savings that even OMB put out.

On the other hand, we have a raft of recent articles questioning the ability to get to these savings, federal managers and the private sector’s belief in them, and even the ability to accurately calculate and report on them.

Federal Computer Week (1 Feb 2012)–“Federal managers doubt cloud computing’s cost-savings claims” and that “most respondents were also not sold on the promises of cloud computing as a long-term money saver.”

Federal Times (8 October 2012)–“Is the cloud overhyped? Predicted savings hard to verify,” and an included table shows projected cloud-saving goals of only about $16 million per year across 9 Federal agencies.

CIO Magazine (15 March 2012)–“Despite Predictions to the Contrary, Exchange Holds Off Gmail in D.C.” cites how with a pilot of 300 users, they found Gmail didn’t even pass the “as good or better” test.

ComputerWorld (7 September 2012)–“GM to hire 10,000 IT pros as it ‘insources’ work,” so that the majority of the work is done by GM employees and enables the business.

Aside from the cost savings and mission satisfaction with cloud services, there is still the issue of security, where according to the article in Forbes from this year, still “A majority of IT managers, 85%, say they are worried about the security implications of moving their operations to the cloud,” with the applications being moved mostly being things like collaboration and conferencing tools, email, and administrative applications–these are not primarily the high-value, mission-driven systems of the organization.

Evidently, there continues to be a huge disconnect between the hype and the reality of cloud computing.

One thing is for sure–it’s time to stop making up cost-saving numbers to score points inside one’s agency or outside.

One way to promote more accurate reporting is to require documentation substantiating the cost savings by showing the before and after costs–and, oh yeah, including the migration costs and all the planning that goes into it.
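
As a simple illustration of why that documentation matters, here is a toy before-and-after calculation with entirely made-up numbers; ignoring the one-time migration cost would overstate the savings by a factor of three in this hypothetical case:

```python
# Toy example of the substantiation the post asks for: before vs. after run costs
# plus one-time migration cost. All figures below are made up for illustration.
def net_cloud_savings(before_annual: float, after_annual: float,
                      migration_one_time: float, years: int) -> float:
    """Net savings over a planning horizon, counting migration as a one-time cost."""
    return (before_annual - after_annual) * years - migration_one_time


if __name__ == "__main__":
    # Hypothetical system: $10M/yr on-premises, $7M/yr in the cloud,
    # $6M one-time migration cost, evaluated over 3 years.
    savings = net_cloud_savings(10_000_000, 7_000_000, 6_000_000, years=3)
    print(f"Net 3-year savings: ${savings:,.0f}")  # $3,000,000 -- not the $9,000,000 a naive comparison suggests
```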

Another, more drastic way is to take the claimed savings back to the Treasury and the taxpayer.

Only with accurate reporting and transparency can we make good business decisions about what the real cost-benefits are of moving to the cloud and, therefore, what actually should be moved there.

While there is an intuitive sense that we will reduce costs and achieve efficiencies by using shared services, leveraging service providers with core IT expertise, and paying for only what we use, we still need to know the accurate numbers and risks to gauge the true net benefits of cloud.

It’s either know what you are actually getting, or just go with what sounds good and try to pull out a cookie–how would you proceed?

(Source Photo: Andy Blumenthal)

			

SDLC On Target


I found this great white paper by PM Solutions (2003) called “Selecting a Software Development Life Cycle (SDLC) Methodology.”

The paper describes and nicely diagrams out the various SDLC frameworks:

– Waterfall
– Incremental
– Iterative
– Spiral
– RAD
– Agile

It also provides a chart of the advantages and disadvantages of each framework.

Finally, there is a simple decision cube (D3) based on time horizon, budget, and functionality for selecting an SDLC framework.
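
As a rough sketch of how such a selection rule might be coded, here is a hypothetical helper along the cube’s three dimensions; the thresholds and the framework each combination maps to are my own simplification for illustration, not the paper’s actual decision rules:

```python
# Hypothetical helper along the decision cube's three dimensions (time horizon,
# budget, functionality). The thresholds and mappings are my own simplification
# for illustration--they are not the PM Solutions paper's actual decision rules.
def pick_sdlc(months: int, budget_fixed: bool, requirements_stable: bool) -> str:
    """Suggest an SDLC framework from rough project characteristics."""
    if requirements_stable and budget_fixed:
        return "Waterfall" if months <= 12 else "Incremental"
    if not requirements_stable and months <= 6:
        return "Agile"    # short horizon, shifting requirements
    if not requirements_stable and not budget_fixed:
        return "Spiral"   # risk-driven, evolving scope and funding
    return "Iterative"


if __name__ == "__main__":
    print(pick_sdlc(months=4, budget_fixed=False, requirements_stable=False))   # Agile
    print(pick_sdlc(months=18, budget_fixed=True, requirements_stable=True))    # Incremental
```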

This is a very useful and practical analysis for implementing SDLC, and it aligns closely with the guidance from the National Institute of Standards and Technology (NIST) Special Publication (SP) 800-64, “Security Considerations in the Systems Development Life Cycle” Appendix E that states:

“The expected size and complexity of the system, the development schedule, and the anticipated length of a system’s life may affect the choice of which SDLC model to use.”

While NIST focuses on the time horizon and complexity, versus the PM Solutions Decision Cube that uses time horizon, budget, and functionality, the notion of tailoring SDLC to the project is both consistent and valuable.

Just one more resource that I found particularly good is the Department of Labor IT Project Management guidance (2002)–it is a best practice from the Federal CIO website.

I like how it integrates SDLC, IT Project Management, IT Capital Planning and Investment Control (CPIC), and security and privacy into a cohesive guide.

It also establishes project “thresholds” to differentiate larger or more significant projects with greater impact from others and calls these out for “more intensive review.”

Even though these resources are around a decade old, to me they are classic (in a good sense) and remain relevant and useful for developing systems that are on target.

(Source Photo: Andy Blumenthal)

Running IT as an Ecosystem


The New York Times (27 November 2011) has an interesting article under “bright ideas” called “Turn on the Server. It’s Cold Outside.”

The idea in the age of cloud and distributed computing, where physical location of infrastructure is beside the point, is to place (racks of) servers in people’s homes to warm them from the cold.
The idea is really pretty cool and quite intuitive: rather than use expensive HVAC systems to cool the environments where heat-generating servers are housed, we can use those servers to warm cold houses and save money and resources on buying and running furnaces to heat them.
While some may criticize this idea on security implications–since the servers need to be secured–I think you can easily counter that such a strategy, under the right security conditions (some of which are identified in the article–encrypting the data, alarming the racks, and so on), could actually add a level of security by distributing your infrastructure, thereby making it less prone to physical disruption by natural disaster or physical attack.
In fact, the whole movement toward consolidation of data centers should be reevaluated based on such security implications. Would you rather have a primary and backup data center that can be taken out by a targeted missile or other attack, for example, or more distributed data centers that can more easily recover? Indeed, the move to cloud computing, with data housed sort of everywhere and anywhere globally, offers the possibility of just such protection and is in a sense the polar opposite of data center consolidation–two opposing tracks currently being pursued simultaneously.
One major drawback to the idea of distributing servers and using them to heat homes: while it offers cost savings in terms of HVAC, it would be very expensive in terms of maintaining those servers at all the homes they reside in.
In general, while it’s not practical to house government data servers in people’s homes, we can learn to run our data centers in a more environmentally friendly way. For example, the article mentions that Europe is using centralized “district heating,” whereby heat from more centralized data centers is distributed by insulated pipes to neighboring homes and businesses, rather than actually locating the servers in the homes.
Of course, if you can’t heat your homes with data servers, there is another option that gets you away from having to cool down all those hot servers, and that is to locate them in places with cooler year-round temperatures and use the area’s natural air temperature for climate control. So if you can’t bring the servers to heat the homes, you can at least house them in cold climates to be cooled naturally. Either way, there is the potential to improve our environmental footprint and increase cost savings.
Running information technology operations with a greater view toward environmental impact, and seeing IT in terms of the larger ecosystem it operates in, necessitates carefully balancing the mission needs for IT, security, manageability, and recovery against the potential benefits of greater energy independence, environmental sustainability, and cost savings. It is the type of innovative, bigger-picture thinking that we can benefit from to break the cycle of inertia and inefficiency that too often confronts us.
(Source Photo: here)