Visual search is an integral and complicated component of most human-computer interaction (HCI) tasks, and many design decisions can affect it. For example, listing menu items alphabetically or grouping them by function can reduce the number of items a user must search through compared with an arbitrary ordering, and common, well-understood terms for web page link labels can increase the chance that users choose the correct links. While a large number of user interface design guidelines exist for web and application development, many are not well supported by research or are only loosely based on related work. In this talk, I will discuss recent, ongoing work investigating the effects of semantic and perceptual grouping on the visual search of text-based layouts. An experiment was run in which the semantic cohesion and labeling of groups were varied. I will discuss results discovered so far based on reaction time data. In addition, I will discuss the directions I am pursuing with the analysis of eye movement data collected from the experiment. Specifically, I will discuss the application of local sequence alignment to the analysis of scanpaths (the sequences of locations fixated during visual search) and to error correction in eye movement data.
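The local sequence alignment mentioned above is commonly implemented with the Smith-Waterman algorithm, which finds the highest-scoring matching subsequence between two sequences. As a minimal sketch (not the talk's actual method or data), two scanpaths can be encoded as strings of fixated-region labels and scored like this; the region labels and scoring parameters here are hypothetical:

```python
# Minimal Smith-Waterman local alignment between two scanpaths,
# each encoded as a string of fixated-region labels (hypothetical data).

def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    # H[i][j] = best score of a local alignment ending at a[i-1] and b[j-1]
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # A local alignment never drops below zero: restart instead
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Two hypothetical scanpaths over screen regions labeled by letters;
# the shared B-C-D subsequence drives the score (3 matches x 2 = 6).
print(smith_waterman("ABCDEF", "XBCDY"))  # prints 6
```

A high local alignment score indicates that two scanpaths share a similar stretch of fixations even if they diverge elsewhere, which is what makes local (rather than global) alignment attractive for comparing search behavior.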