Thoughts of an Angel
As one who considers herself a spiritual person, and was raised as such, I'd like to think that I am fairly well-educated about most theological matters.  I find it quite important to know what you believe and why.  If nothing else, college has given me opportunity upon opportunity to "sharpen my claws," as it were.  The same concept applies to computers.  If you think that sounds cheesy, you may be right.  After all, who would think that the beliefs you base your values and morals on are akin to technological beliefs?  Don't believe me?  Try telling a Mac user that they may be even *remotely* wrong and you may have a technological jihad on your hands.  Personally, I don't mind debate, and even welcome it - so long as one is well-informed about the subject matter.  What I believe few people realize is that when they argue Mac vs. PC, they are essentially arguing over whether to have green beans or strawberries for dinner.  The two reach out to different target markets with little overlap.

The Church of Macintosh - I have to hand it to Apple.  They know their market.  Many of their customers buy Apple products because they want a computer "they can just use."  So Apple heard the cries of its consumers and built one of the most stable OS's around - the Macintosh OS.  You don't have to think to use it.  That's the beauty of it.  One of the primary factors behind that stability is the proprietary hardware - Apple's middle finger stuck out at you, daring you to try installing its OS on any other hardware.  (Hackintosh to the rescue!)  From what I've seen, anyone who tries to use a Mac beyond its capability (which isn't saying much) is the one who experiences problems with it.
Then we have the all-inclusive AppleCare.  If you were able to afford the machine in the first place, you have to pay more for AppleCare - the only way you're going to get the computer fixed for cheap, if not free.  Now, I have heard no complaints about AppleCare.  In fact, I hear nothing but praise for it.  Still, one would think that after paying the nearly 50% premium for a Mac, you would just get AppleCare with the product.
The other reason people love the Mac so much is the belief that it doesn't get viruses.  A quick read of this FierceCIO article and a US-CERT site search will tell you otherwise.  The main reason Macs don't get many viruses now is their low market share.  While Apple's consumer market share is increasing, the corporate share belongs to Microsoft - and corporations are who hackers generally want to go after.

The Microsoft Cathedral - While Apple is busy reaching out to the average everyday tech consumer, Microsoft is focusing on one of its strengths - business software.  Their Azure program was released primarily for businesses (if you want a private cloud, just get a VM).  After attending the TechEd Conference and competing in the Imagine Cup IT Challenge, I have found that it is possible to run your entire home, business, and IT architecture off of Microsoft products alone.  I'd like to see any other proprietary-based software company do that.  This is why Microsoft has the majority of the corporate market share.
One of the things I believe Microsoft suffers from is the same thing Google does with its Android phones - hardware inconsistency.  If you want to build a computer from scratch, you build a Windows computer.  Within reason, you can customize it right down to the number of circuits you want on your motherboard, and if your mobo was properly built, Windows will still run on it.  Fantastic and terrible at the same time.  The only real trouble with such hardware flexibility is that with inconsistent specifications, you're going to get inconsistent software behavior.  At its core, an OS is an interface between the user and the hardware.  So it is up to Microsoft to keep broadening its OS's ability to communicate with different types of hardware.
Amazingly enough, Microsoft only now started including anti-virus software with its Windows 7 machines (not that it wasn't needed all along).
While Apple is indeed catching up in the gaming department, so far nothing's been able to beat Windows Aero for graphics - a gamer's dream come true!  =)

Linux - What?  No church name?  Linux doesn't need one, purely because there is no single standard to measure against.  What I mean is that if I tried to compare the Linux OS, I would have to compare every distribution - maybe that'll be another post.  These OS's are more for the savvy who like to make their own things - which I admire.  I like the spirit of the developers.  Heck, I've been contemplating turning this laptop into a Linux box.  We'll see about that though.
Tl;dr - Don't bother with Linux unless you a) *really* don't want to think about what you're clicking (might I suggest Ubuntu or Mandriva?) or b) can do nothing *but* think about what you're doing (you'd love BackTrack, Red Hat, Fedora, etc.).

In the end, it's all a matter of personal taste and needs.  A Mac is going to be better at fulfilling some needs than a Windows PC is, and vice-versa.
Due to a needed correction, I have redone the test and made sure that both browsers pulled from the same servers (thanks, Billy!).
Google Chrome vs. IE9 [speed test results screenshots]

Eh...they have their trade-offs, it looks like. 

My guess would be that, given that we were on a Microsoft network in Atlanta, something was configured specifically for IE9.  Still, "kudos" to Microsoft for making their browser a great deal better over time.

I still maintain that Bing makes for an awful verb. 
Professional Tip of the Day: Now go Bing yourself.
I am in Texas again.   Here are the results:
Google Chrome vs. IE9 [speed test results screenshots]


'Nuff said.

As for location, I am currently in Arlington, the sweaty and slow armpit of internet connectivity.  I only have to sit on a different side of a couch before my phone begins roaming.  And well, the router was $15 at a garage sale...and it's AT&T service (can someone say *blegh!*?).

I also noticed that IE9 had to draw from a different server, which I find interesting. 

Anyhow, g'day!
This session discussed the basics of capturing and analyzing network packets using Wireshark.  

First, they briefly discussed potential legal issues.  For instance, one should be aware of the local and national laws concerning computer technology and cyber security.  One must have permission to capture and review traffic for purposes of troubleshooting, optimization, security, and application analysis.  I do believe that permission has to be in writing, too...

One must know the chain of custody and create SHA1, RIPEMD160, or MD5 hashes of trace files one plans to use as evidence, using capinfos - the command-line utility that ships with Wireshark.
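If Wireshark's command-line tools aren't handy, the same digests can be produced with standard tooling.  Here's a minimal Python sketch of my own (the file path is whatever trace you're preserving; RIPEMD-160 is left out because its availability varies by OpenSSL build):

```python
import hashlib

def trace_digests(path):
    """Compute MD5 and SHA-1 digests of a capture file for chain of custody."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        # Read in chunks so multi-gigabyte traces don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            md5.update(chunk)
            sha1.update(chunk)
    return {"md5": md5.hexdigest(), "sha1": sha1.hexdigest()}
```

Record the digests the moment the trace is captured; re-hashing later proves the evidence file hasn't changed in the meantime.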

I also learned that most bot-infected hosts and their Command and Control (C and C) servers can be detected by capturing and analyzing DNS responses.
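One common heuristic for this (my illustration, not from the session): algorithmically generated C and C domain names tend to have much higher character entropy than human-chosen ones, so an entropy score over the names seen in DNS responses can surface candidates worth a closer look.  The 3.5-bit threshold below is an illustrative assumption, not a tuned value:

```python
import math
from collections import Counter

def label_entropy(label):
    """Shannon entropy (bits/char) of one DNS label; DGA names score high."""
    counts = Counter(label.lower())
    total = len(label)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_generated(domain, threshold=3.5):
    """Flag a domain whose highest-entropy label exceeds the threshold."""
    labels = [l for l in domain.split(".") if l]
    return max(label_entropy(l) for l in labels) > threshold
```

A score alone proves nothing - CDN hostnames look random too - but it narrows thousands of responses down to a handful worth eyeballing.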

One of the key points: to know whether or not your server is being attacked, you first have to know how your servers normally behave.

Another point discussed was the responses of host-based firewalls.  Many of them simply come back with an ICMP response when they detect an invalid host attempting access.  However, a host-based firewall should NEVER actually send anything back; it should just drop the connection if it is suspicious.  In other words, do not violate the #1 rule of the internet - do not feed the trolls.
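On a Linux host, the drop-don't-reject advice looks like this in iptables (the subnet here is purely illustrative):

```shell
# REJECT answers the probe with an ICMP error - exactly what we were told to avoid:
iptables -A INPUT -s 203.0.113.0/24 -j REJECT --reject-with icmp-host-unreachable

# DROP silently discards the packet - don't feed the trolls:
iptables -A INPUT -s 203.0.113.0/24 -j DROP
```

From the scanner's side, REJECT confirms a live, filtering host; DROP leaves them staring at a timeout.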

Also, ARP does not get past routers because it lacks an IP header - no identification, no access.

Some Active Discovery Processes were discussed here:
  • ARP Scan - local only; can find "hidden" hosts
  • Ping Scan - ICMP type 8/0
  • ACK Scan - TCP ACK - checks firewall rules
  • FIN Scan - FIN - an illogical TCP frame
  • Xmas Scan - FIN PUSH URG
  • Null Scan - no flags set
  • Maimon Scan - FIN/ACK
  • Idle Scan - uses a zombie host; watches the IP ID value
  • TCP Port Scan - stealth or full
  • UDP Port Scan - listening for ICMP responses
  • OS Fingerprinting Scan - TCP, UDP, and ICMP probes
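The flag-based scans in that list can be told apart purely by which TCP flags are set.  Here's a toy lookup of my own making - a labeling aid, not a detector, since real traffic needs more context than a single packet:

```python
# Map TCP flag combinations to the scan type they suggest.
SCAN_SIGNATURES = {
    frozenset(): "Null scan",
    frozenset({"FIN"}): "FIN scan",
    frozenset({"FIN", "PSH", "URG"}): "Xmas scan",
    frozenset({"FIN", "ACK"}): "Maimon scan",
    frozenset({"ACK"}): "ACK scan (firewall rule check)",
}

def classify_flags(flags):
    """Return the likely scan type for a set of TCP flag names, or None."""
    return SCAN_SIGNATURES.get(frozenset(flags))
```

A lone FIN or an empty flag field is "illogical" precisely because no legitimate TCP state machine emits it outside an established connection - which is what makes these combinations such clean signatures.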

Remember, the difference between reconnaissance and a breach is what they are used for.
Here are some of the signatures of suspicious traffic:
  • Unusual ports in use
  • Unusual protocols in use
  • High TCP "data" rate/Undissected traffic
  • Unusual conversation pairs
  • Unusual endpoints
  • High number of application failures/error responses
  • Higher-than-normal traffic rates
  • Higher-than-normal conversations per user
  • Traffic to/from illegal MAC or IP address

I also learned that a dark MAC address or dark IP address is a bogus address whose packets get treated like broadcasts.  Consequently, the network gear simply keeps flooding the network in search of a machine that matches the MAC and/or IP address but, of course, never finds one.

I also learned how to create Coloring Rules.  Essentially, you tell Wireshark to find packets that meet a certain condition; any packet that matches gets highlighted in the color you assign to that rule.  One suggestion is to assign your largest threats the color(s) most aggravating and/or certain to catch your attention.

And finally, I leave you with two pieces of advice:  
1) Try to stay away from using "!=" in your filters.  Instead, negate the whole comparison: "!(<field> == <value>)".  For instance (the address is just an illustration), instead of "ip.addr != 10.0.0.1" - which rarely does what you expect, since every packet has two addresses and one of them will always be unequal - use "!(ip.addr == 10.0.0.1)".
2) When you look at your trace files, the payload will come up as a jumble of characters.  Just know that data beginning with the letters "MZ" should be treated as an executable file, because it is - "MZ" is the magic number at the start of DOS/Windows executables.  Be very careful with this, however.
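As a tiny illustration of my own, hunting for that marker in a reassembled payload is just a byte search:

```python
def find_mz_offsets(payload: bytes):
    """Return the offset of every 'MZ' marker in a captured payload.

    'MZ' at the start of a transferred file is the DOS/PE executable
    magic number; hits mid-stream may be coincidence, so verify the
    surrounding bytes before acting on them.
    """
    offsets = []
    pos = payload.find(b"MZ")
    while pos != -1:
        offsets.append(pos)
        pos = payload.find(b"MZ", pos + 1)
    return offsets
```

An "MZ" at offset 0 of a file transfer is the strong signal; an "MZ" buried in the middle of a text stream is usually just two unlucky letters.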

More to come later today!
With 10,000 attendees from 84 countries and 800 Microsoft participants, Microsoft TechEd 2011 is hosting 551 unique sessions and 250 hands-on labs (among other things).  As of a bit after 3:00pm, here is my summary of the day:

We (the bloggers and the Imagine Cup team) walked in right before The Glitch Mob played masterfully as our pre-show entertainment.  Our Imagine Cup team stood up to applause shortly after a talk about their successful project, which involved portable medical imaging/ultrasound to give much less expensive access to diagnostic health care for people who are unable to afford it.

Robert Wahbe, CVP of Server and Tools, was our keynote speaker.  He talked about many applications of both the Public and Private Cloud, including extending existing applications, dealing with large data sets and data warehousing, reaching greater high-performance computing capability, better promotion of events and content distribution, and better use of the Cloud for marketing campaigns and gaming web sites.

Several demos were put on that I found quite interesting.  
Joey Snow demonstrated a few Cloud services such as requesting Private Cloud capacity, deploying from the System Center via a New VMM Service Deployment, and Public Cloud deployment.
Amir demonstrated one of the ways the Cloud can be used as a Business Intelligence system, using PowerPivot to combine full spreadsheet, database, and graphing functionality.  For those nay-sayers who believe the Cloud is not capable of good speed - think again.  In the time it takes to blink, he performed a query on a database of 2 billion records, retrieving a bit more than a million records matching his query.
Augusto Valdez demonstrated Cloud-Based Productivity via Windows Phone 7 and its ability to sync with its PC-based software via the Cloud.  He showed us how to sync with Outlook as well as Lync via Lync Mobile.  Finally, he showed us the e-mail security capabilities that one can use on Windows Phone 7.
Edwin Yuen presented what was perhaps my favorite demo - the WorldWide Telescope using the Xbox Kinect.  He showed us a literal real-time view of events and objects, from the greatest solar eclipse that will happen in our lifetimes (in 2014) to the entirety of the known universe.
Cameron Skinner discussed managing the application life cycle, using the example of the Cloud as a communication channel between the Operations side of IT (Infrastructure) and Developers to meet the needs of the customer, understand the requirements, and agree on the priorities of the application.
There was one more demonstration of making an application to address how a call center assigns tickets to technicians.

After wandering about the Convention Center for a while (this is a HUGE place with SO much to do!  You really should be here!), I went to a session on "Wiretapping."  It is a basic how-to session on using Wireshark to capture and analyze traffic.  This is discussed in the next entry if you're interested...