Techniques developed and assumptions made during the early years of the genomic revolution can be stretched to the breaking point in the current era of big data. Two methods, phylogenetic profiling and shotgun sequence assembly, work well on what we would now consider "small" datasets but become intractable or unreliable when many genomes are considered simultaneously. These examples illustrate how methodological choices made when it was still possible to consider all of the data can handcuff those techniques once doing so is no longer practical. It has been said that the key to success in bioinformatics is knowing what data can be ignored. Partial Phylogenetic Profiling (PPP) and mini-HMM analysis of metagenomes are two techniques that sidestep the bottlenecks of big data by discarding foundational assumptions that are no longer tenable. PPP identifies proteins that work together across many genomes without first defining a global set of protein families. Mini-HMMs allow hundreds of genomes to be analyzed simultaneously against known marker genes without first having to correctly assemble all of the shotgun data.
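To make the classic technique concrete, a minimal sketch of phylogenetic profiling is shown below. The data and the scoring rule are illustrative assumptions, not the paper's method: each protein family is reduced to a presence/absence vector over a fixed, ordered set of genomes, and families whose vectors match are flagged as candidates for functioning together. The gene names and the simple Hamming-distance threshold are hypothetical simplifications.

```python
# Hypothetical presence/absence calls for four protein families
# across five reference genomes (1 = present, 0 = absent).
profiles = {
    "geneA": (1, 0, 1, 1, 0),
    "geneB": (1, 0, 1, 1, 0),   # identical profile to geneA
    "geneC": (0, 1, 0, 0, 1),   # complementary profile
    "geneD": (1, 1, 1, 1, 1),   # present everywhere: uninformative
}

def profile_distance(p, q):
    """Hamming distance: number of genomes where the presence calls differ."""
    return sum(a != b for a, b in zip(p, q))

def co_occurring(query, profiles, max_dist=0):
    """Return families whose profiles lie within max_dist of the query's."""
    target = profiles[query]
    return sorted(
        name for name, vec in profiles.items()
        if name != query and profile_distance(target, vec) <= max_dist
    )

print(co_occurring("geneA", profiles))  # geneB shares geneA's profile
```

The sketch also shows why the approach strains at scale: it presupposes that every protein in every genome has already been assigned to a global family so that a single row in the table exists for it, which is exactly the assumption PPP discards.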