Enterprise Version X

What No EDI?

Yes Geronimo, this is no longer Kansas.  I am amazed, and maybe I haven't been keeping abreast of what is actually happening in the so-called Enterprise world (even though I have been working in it for a while).  Here is my confusion.  The definition of Electronic Data Interchange (EDI) reads: "It is used to transfer electronic documents or business data from one computer system to another computer system without human intervention."  Given that (and here is where I may be getting confused), many companies, for several disparate reasons, are requesting to scale – writ large – these batch-processing EDI systems – without changing anything – just by adding a SOA layer on top of the existing RDBMS architecture.  Really?  Recently I wrote a blog about map reduction of machine learning algorithms and mentioned NoSQL architectures; Nathan Hurst wrote a great blog that took a page out of Brewer's CAP theorem for a visualization of NoSQL and RDBMS:


Where most of these companies and entities get into trouble is not wanting to change the fundamental data model.  Many of these "conversions," or the desire to convert, should start with understanding the data that resides in these decades-old RDBMSs, or the basic process in general of what is happening within these enterprise systems:


Basically we are running into a problem where the fundamental issues are:  (1) the data model, (2) creation of a set of canonical rules, and (3) choosing an architecture based on the triangle mentioned above.  Most companies do not want to invest in understanding, from a programmatic perspective, what is in the database.  Most companies keep adding data, stored procedures, key-value pairs and columns and call that scale.  Then they realize that 7M, 10M, 50M, 200M people are going to hit this system in a stochastic manner, so they hire more analysts, more coders, more people.  That is not scale.  Scale is creating a canonical set of rules, reducing the data model to a core set of attributes, and applying machine learning for data hygiene and integrity.  Then naturally most people inquire about data accuracy.  Well, it is a give and take; humans are not 100% accurate 100% of the time.  I would rather apply an 80/20 rule and get the data results faster than operate on the premise of 100% correct data 100% of the time.  It is more natural to operate on the 20% outliers and re-train the system based on those anomalies.  Remember, the first step is acknowledgement of the actual problem.  As I said in a much earlier blog on adaptability, homeostasis is the death knell of most companies.  You must build an evolutionary or revolutionary strategy into your technology roadmap, or at least plan with some contingencies.  This brings into question: what exactly is enterprise software?
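The 80/20 hygiene loop above can be sketched in a few lines.  This is purely illustrative – the function names and the naive "score" are mine, not any product's – but it shows the shape of the idea: accept the bulk the model is confident about, route the outlier minority to review, and retrain on those corrections.

```python
# Hypothetical sketch of the 80/20 data-hygiene loop: accept records the
# model scores as "clean", route the outlier minority to review/retraining.
# score_record and hygiene_pass are illustrative names, not a real library.

def score_record(record):
    """Toy hygiene score: fraction of fields that are non-empty."""
    filled = sum(1 for v in record.values() if v not in (None, ""))
    return filled / len(record)

def hygiene_pass(records, threshold=0.8):
    clean, outliers = [], []
    for r in records:
        (clean if score_record(r) >= threshold else outliers).append(r)
    return clean, outliers

records = [
    {"id": 1, "name": "Acme", "duns": "123"},
    {"id": 2, "name": "", "duns": None},
]
clean, outliers = hygiene_pass(records)
# The outlier minority becomes the training set for the next model revision.
```

In practice the scoring function is a learned model, and the anomalies feeding back into it are exactly the 20% you re-train on.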

Until Then,

Go Big Or Go Home!

//ted

Considerations for NoSQL and Map Reducing Machine Learning Algorithms

The Machine Is Learning

The past couple of weeks have been rather tumultuous for me and several others.  I won't go into details and "kiss and tell," but suffice it to say it has led me to the land of "Free Agency".

As such, over the past couple of weeks I have met several people and have been discussing items such as Big Data, Semantics, Hadoop, NoSQL, Data Science <<insert favorite bingo buzzword here>>. One issue that I see over and over concerns the usage of different distributed compute frameworks. Saying "Hadoop" is not a panacea for your Big Data problems.  One must understand the problem you're trying to solve.  Many people fall prey to the "new shiny thing" paradigm.  On the issue of Big Data concerns: if you want to scale horizontally, go with a NoSQL solution – if it is necessary.  The NoSQL database is implemented as a data grid for processing (mapReduce, queries, CRUD, etc.).  Big websites and players that have moved towards non-relational datastores include LinkedIn, Amazon, Digg and Twitter. One of the main reasons for using NoSQL solutions is that relational databases place computation on reads, which is considered wrong for large-scale web applications such as Digg.

Aspects of this behavior are:
• The serial nature of applications, often waiting for I/O from the data store, does no good for scalability and low response times.
• Huge amounts of data and a high growth factor led Twitter toward Cassandra, which is designed to operate with large-scale data.
• Furthermore, operational costs of running and maintaining systems like Twitter et al. escalate. Web applications of this size therefore "need a system that can grow in a more automated fashion and be highly available."
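The read-versus-write computation point above is easiest to see with a toy fan-out-on-write sketch.  This is not any vendor's API – the `follows`/`timelines` structures are invented for illustration – but it shows the trade: instead of joining follows against posts at read time, each post is pushed into a precomputed, key-value-style timeline, so reads become cheap lookups.

```python
# Illustrative fan-out-on-write: move computation from reads to writes,
# as large social sites do. Data structures here are toy stand-ins for a
# distributed key-value store.

from collections import defaultdict

follows = {"alice": {"bob", "carol"}}   # alice follows bob and carol
timelines = defaultdict(list)           # materialized per-user timelines

def post(author, text):
    # Write-time cost: one append per follower.
    # Read-time cost: an O(1) key lookup, no join.
    for user, followees in follows.items():
        if author in followees:
            timelines[user].append((author, text))

post("bob", "hello")
post("carol", "hi")
print(timelines["alice"])   # [('bob', 'hello'), ('carol', 'hi')]
```

The design choice is the classic one: pay on the write path, which you control, rather than on the stochastic read path that millions of users hit.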

Amazon’s James Hamilton agrees:

“Scale-first applications are those that absolutely must scale without bound and being able to do this without restriction is much more important than more features. These applications are exemplified by very high scale web sites such as Facebook, MySpace, Gmail, Yahoo, and Amazon.com. Some of these sites actually do make use of relational databases but many do not. The common theme across all of these services is that scale is more important than features and none of them could possibly run on a single RDBMS.”

Ok, so let's assume you have your high-availability horizontal scale framework (HAHSF) in place and you are harvesting, ingressing and egressing data.  Now what?  You must figure out the data science strategy and what to do with the data.  Otherwise it's like hoarding – you're just harvesting and storing it – which by definition means you need to do some type of statistics, machine learning or data mining.  Here is where I believe most people get slightly sideways.  Map reduction is not Hadoop.  Map reducing machine learning algorithms is at the forefront of what is occurring within scale applications.

Machine Learning Review

For a refresher I would suggest breaking out that grey book, Machine Learning by Professor Mitchell, or Haykin's Neural Networks tome.  So let's start with something we probably all learned in an AI or machine learning class.  Remember the back-propagation network?  If not, here is the diagram for a refresher:

Essentially the system is a linear combination of weights passed through a nonlinear activation function that provides a threshold; the error is then fed back and the weights are changed.  There are three types of neurons in a neural network created by the BP ANN algorithm:
  • Input neurons

For discrete input attributes, an input neuron typically represents a single state from the input attribute. This includes missing values, if the training data contains nulls for that attribute. A discrete input attribute that has more than two states generates one input neuron for each state, and one input neuron for a missing state, if there are any nulls in the training data. A continuous input attribute generates two input neurons: one neuron for a missing state, and one neuron for the value of the continuous attribute itself. Input neurons provide inputs to one or more hidden neurons.

  • Hidden neurons

Hidden neurons receive inputs from input neurons and provide outputs to output neurons.

  • Output neurons

Output neurons represent predictable attribute values for the data mining model. For discrete predictable attributes, an output neuron typically represents a single predicted state for a predictable attribute, including missing values. For example, a binary predictable attribute produces one output node that describes a missing or existing state, to indicate whether a value exists for that attribute. A Boolean column that is used as a predictable attribute generates three output neurons: one neuron for a true value, one neuron for a false value, and one neuron for a missing or existing state.

A neuron receives input from other neurons, or from other data, depending on which layer of the network it is in. An input neuron receives inputs from the original data. Hidden neurons and output neurons receive inputs from the output of other neurons in the neural network. Inputs establish relationships between neurons, and the relationships serve as a path of analysis for a specific set of cases.

Each input has a value assigned to it, called the weight, which describes the relevance or importance of that particular input to the hidden neuron or the output neuron. The greater the weight that is assigned to an input, the more relevant or important the value of that input. Weights can be negative, which implies that the input can inhibit, rather than activate, a specific neuron. The value of each input is multiplied by the weight to emphasize the importance of an input for a specific neuron. For negative weights, the effect of multiplying the value by the weight is to deemphasize the importance.

Each neuron has a simple non-linear function assigned to it, called the activation function, which describes the relevance or importance of a particular neuron to that layer of a neural network. Hidden neurons use a hyperbolic tangent function (tanh) for their activation function, whereas output neurons use a sigmoid function for activation. Both functions are nonlinear, continuous functions that allow the neural network to model nonlinear relationships between input and output neurons.
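A minimal forward pass makes the description above concrete: hidden neurons apply tanh to a weighted sum of inputs, and output neurons apply a sigmoid to a weighted sum of hidden activations.  The weights below are hard-coded for illustration; a real network would learn them via back-propagation of the error.

```python
# Minimal forward pass matching the text: tanh hidden layer, sigmoid
# output layer. Weights are illustrative, not trained.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    # hidden layer: weighted sum of inputs, then tanh activation
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, inputs)))
              for ws in w_hidden]
    # output layer: weighted sum of hidden activations, then sigmoid
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden)))
            for ws in w_output]

out = forward([1.0, 0.5],
              w_hidden=[[0.4, -0.6], [0.3, 0.8]],
              w_output=[[1.2, -0.7]])
# out[0] lies in (0, 1): the sigmoid output neuron's predicted value
```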

Map Reducing The Algorithm

Now here is where the worlds of real-time systems and distributed systems collide.  Suppose we have an epoch, an error propagated back, and weights that need updating.  We need to map reduce this functionality to fully gain benefit from the map reduction – the hadoop-it-izing magic.  As a specific note, the performance that results depends intimately on the design choices underlying the MapReduce implementation, and how well those choices support the data-processing pattern of the respective machine learning algorithm.

However, not only does Hadoop not support static reference to shared content across map tasks; as far as I know, the implementation also prevents adding this feature. In Hadoop, each map task runs in its own Java Virtual Machine (JVM), leaving no access to a shared memory space across multiple map tasks running on the same node. Hadoop does provide some minimal support for sharing parameters, in that its streaming mode allows the distribution of a file to all compute nodes (once per machine) that can be referenced by a map task. In this way, data can be shared across map tasks without incurring redundant transfer costs. However, this work-around requires that the map task be written as a separate application; furthermore, it still requires that each map task load a distinct copy of its parameters into the heap of its JVM.  That said, almost all ML algorithms, and ANNs in particular, are commonly trained with stochastic or direct gradient descent/ascent (depending on sign), which poses a challenge to parallelization.

The problem is that in every step of gradient ascent, the algorithm updates a common set of parameters (e.g. the update matrix for the weights in the BP ANN case). When one gradient ascent step (involving one training sample) is updating W, it has to lock down this matrix, read it, compute the gradient, update W, and finally release the lock. This "lock-release" block creates a bottleneck for parallelization; thus, instead of stochastic gradient ascent, many methods use a batch method, which greatly slows the process down and takes away from the real-time aspect.  Workarounds that I have personally seen used, and helped create, are memory-mapped shared functions (think signal processing and CPU shared memory modules), which allow a pseudo-persistence to occur and the weights to be updated and parallelized at "epoch time".   As we reduce the latencies of networked systems we are starting to get back to some of the very problems that plagued real-time parallel signal processing systems – the disc I/O and disc read-write heads are starting to become the bottleneck.  So you move to solid-state devices or massive in-memory systems.  That does not solve the map reduction problem of machine learning algorithms.  That said, over the past years the Apache Foundation, and more specifically the Mahout project, has been busy adding distributed algorithms to its arsenal.  Of particular interest is singular value decomposition, which is the basis kernel for processes such as Latent Semantic Indexing and the like.  Even given the breadth of additions, fundamental computational blocks such as Kohonen networks and the algorithm mentioned here – back-propagation – are not in the arsenal.  Some good stuff, such as an experimental Hidden Markov Model, is there.
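The batch work-around above can be sketched without any cluster at all.  This is a toy, single-process stand-in for the map/reduce phases, assuming a linear model with squared-error loss (my choice, for brevity): each "map task" computes a partial gradient over its data shard, the "reduce" sums the partials, and W is updated once per epoch instead of per sample, so no lock is contended mid-epoch.

```python
# Sketch of batch gradient descent in map-reduce form: partial gradients
# per shard (map), summed (reduce), one weight update per epoch.
# Linear model + squared error chosen purely for illustration.

from functools import reduce

def partial_gradient(shard, w):
    # map phase: gradient contribution of one data shard
    g = [0.0] * len(w)
    for x, y in shard:
        err = sum(wi * xi for wi, xi in zip(w, x)) - y
        for i, xi in enumerate(x):
            g[i] += err * xi
    return g

def epoch(shards, w, lr=0.1):
    grads = [partial_gradient(s, w) for s in shards]                       # map
    total = reduce(lambda a, b: [ai + bi for ai, bi in zip(a, b)], grads)  # reduce
    return [wi - lr * gi for wi, gi in zip(w, total)]                      # one update

shards = [[([1.0], 2.0)], [([2.0], 4.0)]]   # y = 2x, split across two "nodes"
w = [0.0]
for _ in range(50):
    w = epoch(shards, w)
# w[0] converges toward 2.0
```

The price is exactly the one stated above: the update happens only at epoch boundaries, trading the real-time responsiveness of stochastic updates for lock-free parallelism.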

That said, I really do believe we will start to see more of a signal processing architecture on the web.  Just maybe "cycle-stealing" via true grids will make a comeback?

Go Big Or Go Home!

@jaxsoncreole

Probablistic Databases For Predictive Content

The Other PDF To You

The Probability Something Will Happen


Digital Remote for Your Life

Well folks, we are going to shift gears here a little and get back to some hardcore Ideas2Bank discussions concerning technologies.  Of late I have been interested once again in Finding-Not-Searching types of behaviors.  Affinity-based systems are once again on the rise.  I will go so far as to say the Age of Affinity is here.  TechCrunch did a write-up recently concerning relevance.  At the end it turned into a pitch for Quora.  However, it did have some good ideas concerning the continuum of personalization functionality, from complete serendipity to exact, personalized, context-aware information constructs based on geo-location.  I have always been a fan of "lean back" technologies – technologies that essentially enable a digital remote for your life.  These types of systems have two common themes: 1) ease of use, and 2) the probability of usage.

Predictive Content and Probability

In today's world we are trying to create predictive, context-aware systems based on the wrong models.  In today's database architectures: 1) an item either is in the database or is not; 2) a tuple either is in the query answer or is not.   This applies to all state-of-the-art data models across the board.  In a probabilistic database we have a different construct altogether, which is a better fit for the flow of content. For a content-prediction event-driven system we can assume the events are precise and have no ambiguity. Instead, it is the future event stream that is unknown, and the matching of a pattern at some point in the future is predicted with some probability.   When, Where and How are the operatives for this type of predictive event – f(Wh,Wr,H) if you will.  Also note I mentioned the word stream.   I believe, given current and future infrastructures for processing, we are bringing back some of the same analogies from large-array signal processing frameworks.  Probabilistic database models set up extremely well for these types of event processing mechanics.

For a probabilistic database we have:

  • “An item belongs to the database” is a probabilistic event.
  • “A tuple is an answer to a query” is a probabilistic event
  • Can be extended to all data models; we discuss only probabilistic relational data

Probabilistic databases distinguish between the logical data model and the physical representation of the data, much like relational databases do in the ANSI-SPARC architecture. In probabilistic databases this is even more crucial, since such databases have to represent very large numbers of possible worlds, often exponential in the size of one world (a classical database).  In complex event processing systems, events from the environment are correlated and aggregated to form higher-level events. Uncertainty in the events may be due to a variety of factors, including imprecision in the event sensors or generators (e.g. streams) and corruption of the communication channel possibly dropping events, which can be measured with entropy metrics.  These attributes lend themselves well to fusion systems and social stream architectures.   Given we are looking at heterogeneous data sources that set up for collisions and data-source integrity issues, these types of databases hold great promise.    In addition, many of these database architectures build upon finite state machine mechanics for event processing in operating systems.  Of further interest, the data is usually imprecise.

Probabilistic databases address types of imprecision where:

  • Data is precise, query answers are imprecise
  • User has limited understanding of the data
  • User has limited understanding of the schema
  • User has personal preferences

Notice a "trend" here?  This sets up very well for content flow predictions.  In addition, these types of systems hold well for principled semantics for complex queries, providing context for queries where the data is usually imprecise.  Data integration and data hygiene are paramount in social stream systems.  Where data accuracy is important, most companies spend 85% of their workload cleansing data.  We could use probabilistic information to reason about soundness, completeness, and overlap of sources (think linked data here).  I have listed some of the main sources of research in probabilistic databases herewith.  As far as I know there are no publicly commercial applications as of yet for this technology.  My bet is we will see some very soon, integrated with some of the other NoSQL-like technologies.
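The two probabilistic events listed earlier – "an item belongs to the database" and "a tuple is an answer to a query" – can be made concrete with a toy tuple-independent relation, the simplest of the representation systems from this research area.  Everything below (the relation, the tuples, the probabilities) is invented for illustration:

```python
# Toy tuple-independent probabilistic relation: each tuple carries the
# probability that it belongs to the database; a query answer's
# probability is computed over the possible worlds. Real research systems
# use far richer representations than independent tuples.

people = [
    ("alice", "seattle", 0.9),   # (name, city, P(tuple is in the DB))
    ("bob",   "seattle", 0.5),
    ("carol", "boston",  0.8),
]

def prob_exists(city):
    # P(at least one person in `city`) = 1 - prod(1 - p_i)
    # over the matching, independent tuples
    p_none = 1.0
    for _, c, p in people:
        if c == city:
            p_none *= (1.0 - p)
    return 1.0 - p_none

print(prob_exists("seattle"))   # 1 - (0.1 * 0.5) = 0.95
```

Note how the query answer itself is a probabilistic event, which is exactly the fit for predicting a pattern match in a future event stream.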

For a list of current research projects see:

Until Then,

Go Big Or Go Home!

@jaxsoncreole

A Couple of Lessons – Responsibility For The Responsible

Responsibility For The Responsible!

Hello all:

I trust this finds everyone well.  We have all the usual worldwide things going on – disasters, wars, etc. – and then in the valley we have huge funding events happening, e.g. Color getting $41M. I was thinking 'bout some lessons lived and learned whilst reflecting on past lives, and a couple of items came out of the "wet-ware" subconscious: 1) Do The Right Thing; 2) Make Decisions Like Your Paycheck Is Irrelevant. (Style note: after all these years of Mac programming and working at Apple I still use object-style capitalization.)  Whilst these items do not appear tech-worthy or money-making, someone said I should teach a Tech Psychology 101 class.

Do The Right Thing

A very successful person in the tech industry, whom at this point I consider close to iconoclastic, said, "Just Do The Right Thing."  I am not going to get into the metaphysical aspects of Right/Wrong here; suffice it to say what this person was commenting on was having a clear conscience and not being biased as to the correct technical and business decision.  In the long run you will benefit, and you can always look back and know that what you did as a technical professional was well and just.  This came during a very complicated situation that involved several people with agendas.  For those who are not in the tech industry, you would be amazed at how emotional coders, architects and program directors are with respect to some of these decisions.  Then again, nowadays they are very close to rockstars, so why not be emotional?  That said, when you have a difficult decision to make, "Do The Right Thing."

Make Decisions Like Your PayCheck Is Irrelevant

Once again, someone well known in the industry was walking with me at a conference trade show, and due to the level of their success and the amount of, well, tech power they possessed, I used this time to ask questions.  I asked them, "What is consistent in your success?"  Without missing a beat this person said, "Work like your paycheck doesn't matter."  Of course you're probably saying that if this person is that successful they can afford it.  Well, dear readers, they were not talking about money; they were talking about reputation.  Money is temporary.  Reputation lasts a long time – possibly until the big sleep.  It took me a while to truly get my head around this because, well, time is money and money is time.  Yet the more I pondered this statement the more it made sense (and cents).  Time went on and I started espousing some of these tenets.  Eventually I came upon a situation where I knew one of my colleagues was going into a meeting that could change the face of I/O connectors (think USB and FireWire wars), and I told him: say what you know and operate like you do not need a paycheck.  He came out of the meeting and said, "You know, I will always operate in that manner.  It completely freed me from figuring out what to do in high-pressure situations."

Responsibility For The Responsible!

First and foremost, one thing I do wish is that I could have met the person who espoused this statement.  I actually came very close when I was in The Valley circa '93.  Yet that is a story for another time and place.  The two previous items roll up into this tenet.  For those that are changing the world via technology, we have the power to create and destroy.  For some this is the game because, well, they have made millions and billions – just as the artist who can create to destroy, which is part of the art.  Yet even though we create technologies that change the very social fabric of humanity, we are still dealing with the human form on a daily basis (caveat emptor, for better or worse), and as such we have decisions that will live with us. Conflict is a given.  It is going to happen.  Humans adore war.  Many a person in the tech industry has destroyed many an art – books could be written on the amount of code and product that hit the trash and <delete>.  Part and parcel of this is the aspect of doing what you're told instead of being responsible and doing the right thing.  Software is the most scalable industry, period.  There may never be another industry as scalable (smart grid maybe, but not yet).  The multi-cast nature of changing entire groups of people via one application or a widget is phenomenal, not to mention the monetary reward.  As I am fond of saying – Idea2Bank as fast as possible – yet this comes with some aspect of Responsibility For The Responsible.  There are those that can and those that cannot; it's a very stratifying industry.

At the end of the day, do what you know, say what you think, and create the software you know is true.  Software only knows brutal honesty – either it works or it doesn't – and you should be a reflection of that creation.

Until Then,

Go Big Or Go Home!

@jaxsoncreole

Reprise Again – The First “T” of a StartUp

Recently Mark Suster presented a writeup at TechCrunch here: Whom Should You Hire? Of interest is the following extraction:

If you’re doing a great job at continually recruiting and if you have a company ready to hire several people, at some point when you have enough of a pipeline of talented people you need a way to separate them. I have a long-standing mantra, “attitude over aptitude.” This is assuming a raw minimum of MIPS in the candidate. They need to be seriously smart / talented in their field to make the minimum grade.

But within this “minimum acceptable talent level” you still have a wide variance of “employee types.” Let’s be honest – some uber talented people are PITAs. I never hire them. One bad apple spoils things for everybody.

You don’t see it coming. You figure, “sure, they’re a pain but they produce such high quality work I’m willing to put up with them.” Don’t. The last thing you need is some rat bastard fomenting trouble.

They’re the ones who are talking pop at cocktail parties when they’ve had one too many. They’re having private lunches with other employees talking about how they’ve lost faith in your vision.

When you hit internal moments of doubt you need the team members who say, “Guys, we can do this! We’re up against the ropes but we’re not down. Let’s dig in.” You need team members who do that when you’re NOT there.”

Truer words were never spoken.  We haven't really changed much in our habits in our short time of evolution here on Earth.  Our survival instincts kick in, and usually people freak out and do weird stuff.  I could write tomes on the things I have seen people do in times of tech-world crisisdom.  There is an old adage: "Do Not Panic and You Will Live."  Most panic and freak out – but those that adapt and hang in there usually win out.  I wrote some time ago about The Three "T"s of a StartUp.  I also wrote some time ago about companies bidding on talent, way before Zuckerberg bid on Twitter for teams: Revisiting The Three T's. I then wrote about the obvious talent shortage and how we are going to see a swarming effect of teams, much like the days of clans in Quake: Mercanaries For Hire.

Yet even with the best of the best of the best, one thing still stands true: at some point, attitude does overshadow that compiler count.

Go Big Or Go Home.

@jaxsoncreole

StartUp Documents and Agreements

Agreements.  Yes, you need them.  In fact you need many types, but you do not have to pay an arm and a leg to a lawyer for them.  For some this will be rather pedantic, yet I will add a nuance or two concerning these matters, so maybe you will pick something up.  Remember, even experts should do the basics over and over just to keep sharp.

The reason I am writing about this seemingly pedantic and rather boring subject is that many companies and people take HUGE amounts of valuable time and money going over and over and over these documents.  Yes, you want to get the basics down, but a startup SHOULD NEVER SPEND EXORBITANT MONIES AND TIME ON THESE DOCUMENTS.  If you're dealing with someone who is being overly difficult about getting all of the minutiae detailed up front and perfect – get rid of them and get someone else – unless they are writing a really big check!  I have seen several companies spend too much time working on these documents when they are essentially free.  TheFunded (http://www.thefunded.com/) has a great set of documents that were gifted by Wilson Sonsini Goodrich and Rosati.  Don't know who they are? Do some homework.

Here is a short list:

  • C-Corp Filing: Ok for some of you I can hear the groans or rebuttal.  Yes YouTube was a special case of and LLC.  Nowadays I am asked if companies are Del C Corp.  So please get the 250.00 or so put 2000.00 in and create your 20M shares.
  • Non-Disclosure Agreement – basically says you wont tell the world everything about the latest NewCorp.  A gentleman’s agreement if you will.   I would cap it at 3 years which actually should be 1 year but hey just sign it and get talking.
  • Offer Letters: Do not make it overly complicated.  State the basic work items of what you expect, meeting granularity, percentage equity stake, strike price based on current valuation, vesting schedule and retainer if any.  For employees also list salary if applicable
  • One Pager”: This discusses the company in a snapshot and allows you to quickly intro your company.  Here is what you want in the one pager: Who are the Founders, Industry, Business, Accountants, Current Investors, Your Ask in Dollars, Use of funds, Number of Employees, Clients, Exit Strategy, Contact information, Summary of Company, Market, Products, Company Management, Board Members (list companies they worked for and advise) and your Logo with address
  • Convertible Note: This eases the pain of raising seed and angel funding.  I am not going into the specifics here but suffice to say there are ways for debt financing and equity financing that can be clean no muss no fuss. Sign seal and get to coding.
  • CAP Table: This is who has what and at what price.  Important for raises.  It should be a very simple Excel spreadsheet allowing you or others to plug in raise amounts and compute dilution as well as percentages.  This also lists founder, restricted, and common stock issuance.
  • Your Deck: The pitch deck.  Know it, love it, and recite it in your sleep.
  • Business Plan: What do those forecasts look like, and why are you going to take on Google, Apple, Microsoft, and Facebook?
  • Stock Plan: This is usually referenced in the offer letters, so at least have some version so the SEC won't freak out when you file your raise with them!
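The dilution arithmetic behind that CAP table item is simple enough to sanity-check yourself before any spreadsheet fancies it up.  Here is a minimal sketch in Python – every name, share count, and valuation below is hypothetical, just to show the mechanics:

```python
# Minimal cap-table sketch: compute ownership percentages and founder
# dilution after a priced raise.  All figures below are hypothetical.

def dilute(holdings, raise_amount, pre_money_valuation):
    """Return post-raise ownership percentages.

    holdings: dict of holder -> share count before the raise.
    New investor shares are issued at the pre-money price per share.
    """
    total_shares = sum(holdings.values())
    price_per_share = pre_money_valuation / total_shares
    new_shares = raise_amount / price_per_share
    post_total = total_shares + new_shares
    post = {h: s / post_total * 100 for h, s in holdings.items()}
    post["New Investor"] = new_shares / post_total * 100
    return post

# Example: two founders splitting 20M issued shares,
# raising $1M on a $4M pre-money valuation.
cap = {"Founder A": 12_000_000, "Founder B": 8_000_000}
post = dilute(cap, raise_amount=1_000_000, pre_money_valuation=4_000_000)
for holder, pct in post.items():
    print(f"{holder}: {pct:.1f}%")
```

The new investor's shares get issued at the pre-money price per share and everyone's percentage falls proportionally – that is all the dilution column in the spreadsheet is doing.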

I have found that some people will try and get fancy with these documents because, well, honestly they do not know any better and they want to appear smart.  Please, if you're faced with any of these documents, just say you don't know what such and such is and get an answer instead of negotiating around the points.  Also, just ask for what you want.  Say, "I want to have 10% restricted shares that vest immediately upon me hitting such and such milestone."  Especially if you're a founder or a coder.  You have the ideas and the coding ability, so you control the show.  One item I would recommend at the very least: ask for restricted stock, for the tax consequences.  If you're a founder you should have founders' shares.

So the main point is that the documents are out there and you don't have to spend a ton of money.

My other point is: if you run into someone who wants to grind on these documents instead of working them out in concurrence with taking your Idea To The Bank, then tell them you don't have time and need to find someone who wants to start creating.

I would love to hear from others out there or any comments and questions.

Until Then,

GO BIG OR GO HOME!

The South Should Follow Suit

Well, leave it up to the crew on the left coast to create something – well, overwhelming.  Yuri Milner and Ron Conway are teaming up to offer EVERY Y Combinator startup – 40 of them – all of them – investment money.  Here are some of the terms:

They haven’t even seen most of the startups yet. This is a bet on the quality of Y Combinator startups in general.

All of the new Y Combinator entrepreneurs gathered at Y Combinator headquarters in Mountain View, California on Friday evening to hear about the offer.  They weren't told why they were supposed to be there, just that something important was happening.  The SV Angel team was there in person.  Milner joined from Europe by video conference.

The terms? $150,000 in convertible debt. With no cap and no discount. If you’re an investor you know exactly what that means and you just shuddered a little. Those aren’t terms that most angels can match.
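To see why investors just "shuddered a little": a convertible note normally converts at the better of a discounted Series A price or the price implied by a valuation cap, so the noteholder gets more shares per dollar than the Series A investors.  With no cap and no discount, the note converts at the plain Series A price.  A rough Python sketch of that arithmetic – every price and amount below is hypothetical:

```python
# Rough convertible-note conversion sketch (all numbers hypothetical).
# The note converts at the LOWER of the discounted Series A price and
# the price implied by the valuation cap; with no cap and no discount,
# the holder simply converts at the Series A price.

def conversion_price(series_a_price, discount=0.0, cap=None, pre_money=None):
    """Price per share at which the note converts."""
    discounted = series_a_price * (1 - discount)
    if cap is not None and pre_money is not None and pre_money > cap:
        # Cap price: scale the Series A price down by cap / pre-money.
        cap_price = series_a_price * cap / pre_money
        return min(discounted, cap_price)
    return discounted

a_price = 1.00   # hypothetical Series A price per share
note = 150_000   # the note amount from the offer above

# Typical angel terms: 20% discount, $5M cap on a $10M pre-money round.
typical = conversion_price(a_price, discount=0.20, cap=5e6, pre_money=10e6)
# No cap, no discount: convert straight at the Series A price.
no_cap = conversion_price(a_price)

print(typical, no_cap)              # 0.5 vs. 1.0 per share
print(note / typical, note / no_cap)  # 300,000 vs. 150,000 shares
```

Under the hypothetical numbers above, the typical angel doubles the share count for the same check; the no-cap, no-discount investor takes the same deal as the Series A.  That is money most angels cannot afford to leave on the table.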

OK, I can hear it now: well, they have had a head start, they can afford it.  Guess what – I am tired of that parrot talk.  You know what?  Some of the oldest money – and the most money, for that matter – is in the South, and the North.  Let me change this – the EAST COAST should follow suit.  Yet they will not.  It's too compartmentalized over here on the "right coast".  Let me illustrate:

1. New York – Ad mafia folks.  Good money but once again always looking for ad related tech.

2. Boston – if you're across the river from MIT, forget it.

3. Washington, DC – contractors and skunkworks

4. Atlanta, GA – trying to be the epicenter of the SE

And really, that is about it.  How do I know?  Over the years I have probably been in direct contact with over 90 VCs, ranging from so-called seed to growth funding.  Really, the biggest and best investment I have seen lately come out of the right coast is from Brian O'Kelley's shop AppNexus – and rightfully so – yet think about the magnitude of what Milner and Conway are going to create.  I know for a fact, and will not illustrate here, that there is a ton of money on the right coast.  Further, there is a ton of money in the South.  Couple that with our quality of life, and Charleston, SC could be an epicenter for technology.  Yet a long time ago, in a land far away, I was criticized for saying that seed money on the east coast is like Series A on the west coast as far as timelines.

I am worried that the left coast is going to come get all the talent on the right coast and especially in The South and call it INSHORING.

Maybe I’ll save that for another blog.

Words That Do Not Mean Anything Vol. #2

Well, lo and behold, we have had input from the B L O G O S P H E R E (wherever the hell that is…) – kinda like the great jukebox in the sky… I digress.  Gotta focus, get things accomplished.  Busy is as busy does, you know… In any event, we have a follow-on, an incremental build if you will, of the previous blog.  See Words That Do Not Mean Anything from my previous blog.  So I had some input, and little did I know that people are really pissed about this issue.  They are fed up!  Yet they still fall into the same process and entrain into the local linguistics.  Monkey see – monkey do…  Without much ado about nothing, here is the latest list… ah, the collaborative nature of the open source world.  A list of phrases and words that do not mean anything, a.k.a. Words That Don't Mean Anything Volume #2.

Timely

tim·ing (tī′mĭng)

n. The regulation of occurrence, pace, or coordination to achieve a desired effect, as in music, the theater, athletics, or mechanics.

This, in most cases, means that you want something done immediately but cannot justify telling the person that you do not really feel like doing it yourself, so you turn it over to them and tell them you need it in a "timely" manner.

Swimmingly

swim·ming·ly (swĭm′ĭng-lē)

adv. With great ease and success

It is my understanding that this term had deeper contextual meaning as well as historical meaning, but for the life of me I could never hear myself saying, "It's going swimmingly."  If anyone has ever done any real swimming, you will know why this is true.

Absolutely

ab·so·lute·ly (ahb-so-lute-lee)

adv. Definitely and completely; unquestionably.

An attempt at pseudo-passion when really you couldn't care less.  Then they completely forget about it after the meeting.  Nice act.  Keep up the good work.

Effective

ef·fec·tive (ĭ-fĕk′tĭv)

adj. Having an intended or expected effect.

I am going to shift to some phrases that I have heard over the years.  Yes, I know this is supposed to be WORDS, but I control the information.

Great, great, glad to hear it.

Scenario: Teleconference.

Person1: “Hey how are you?”

Person2: “Busy, really busy.”

Person1: “Great Great, glad to hear it.”

Person1 couldn't care less.  It could have been much like this:

Scenario: Teleconference.

Person1: “Hey how are you?”

Person2: “Just came back from your house with your wife.”

Person1: “Great Great, glad to hear it.”


Add some color for me

Decoded: I have no clue what is going on tell me exactly why I am even here and what to do…

It’s coming together

Decoded: I hope this thing ships or my ass is done for…

It Just Works

Come on people!  THIS IS NOT A REQUIREMENT! OF COURSE THE THING SHOULD WORK!

Scenario: Requirements Meeting

Person1: “Well I have this idea about using Hive to speed up our data analysis.”

Person2: “Great, great, glad to hear it.  So what's one of the design requirements?”

Person1: “It should just work.”

So here we go…

I need to have these requirements finished in a timely manner such that it adds some color around an effective design so that the coding will proceed swimmingly. One of the main requirements is that It Just Works. Think we can get it done?  Absolutely! I think it's coming together!

Until then,

Go Big Or Go Home.

Words That Do Not Mean Anything

Sometimes It Helps To Look At Ourselves

Over the last ten years or so, aside from the millions of acronyms that have been created, misused, and abused, there is a growing trend of using words that are what I call open-ended – left to the receiver to decide their importance or lack thereof.  In fact, these are now over-used and should be stamped out and eviscerated from technical jargon and meetings.  Here is a short list; I would love for the dear readers to send me some more:

Interesting

in·ter·est·ing  (ĭn′trĭ-stĭng)

def: adj: arousing curiosity or interest; holding or catching the attention.

A word that “sounds” intelligent.  Is it?  I contend people say it because they do not have any idea what to really say, or they are too scared to say what they are really thinking. Nowadays, in this information age, people get paid to think.  They get paid for what they know, which turns into what they do.  Yet sitting there and saying “Interesting” doesn't do a damn thing other than waste my time and other people's.  If you are going to say something – say it.  Otherwise, be quiet.  Sorry, you're not holding my attention. Interesting, isn't it?

Scenario

scen-ar-i-o [si-nair-ee-oh, -nahr-]

noun: 1. an outline of the plot of a dramatic work, giving particulars of the scenes, characters, etc. 2. the outline or manuscript of a motion picture, giving the action in the order in which it takes place. 3. an imagined sequence of events, especially any of several detailed plans or possibilities.

I LOVE this term.  Note item #3: IMAGINED… well, boy howdy, there have been some imagined events.  Strategic scenarios sure do not materialize into money near term, do they?  They are, however, extremely important for planning.  That said, SCENARIO is a term that is being used to mean, “Well, really I want to sound like I know what I am doing, but I don't have a clue – really – nor do I have any real ideas.”

Synergy

syn·er·gy  (sĭn′ər-jē)

noun: the interaction or cooperation of two or more organizations, substances, or other agents to produce a combined effect greater than the sum of their separate effects

Cooperation boys and girls!  “Oh I see it as a Synergism between the two departments.” Decoded:  I am going to take all of your projects and plans away.

or

“I see your investment as synergistic to our needs.” Decoded: please give us the money, even if it means a low valuation with 51% ownership!

Clarity

clar·i·ty  (klăr′ĭ-tē)

noun: the state of having a full, detailed, and orderly mental grasp of something

How many times have you heard this? “I just want to be CLEAR.”  People also use the word transparent.  This is like a human-resources catch-all term that basically means, “I have no idea why I said what I said and put the company at risk.”  In other uses it means, “I am trying to explain something that I do not understand.”

Win-Win

win-win (wĭn′wĭn′)

adjective: advantageous or satisfactory to all parties involved

I mean, really?  C'mon.  There will always be a winner and a loser.  Sorry, you don't get a trophy, the contract, the bonus, or the venture capital money.  See my blog on Social Darwinism and Software Development.

Woo Hoo!

Woo hoo is a common expression of joy, especially as arising from success or good fortune.

Instead of saying “Hell yeah, that was incredible,” people say it as a half-assed attempt at enthusiasm in the workplace.  They also use it at the end of a ski run in a very tight, oppressed manner when they really want to yell.

Blog

blog (a blend of the term web log) is a type of website or part of a website

Never have liked that word.  Sounds like you're vomiting.  Then again, most people just whine and say nothing of importance.  Blogging about Blogging Blogs.

So here we go: I want to create a Scenario that is Interesting, with Synergy that is Clear and creates a Win-Win for everyone that they can Blog about.  Woo Hoo!

Until Then,

Go Big Or Go Home.