Inspirational stimuli are known to be effective in supporting ideation during the design process. However, little prior work has enabled individuals to search using multiple modes of input simultaneously, which is more representative of real design behavior. In the current work, we developed a multi-modal search platform that retrieves 3D model parts based on text-, appearance-, and function-based search inputs. This work presents the results of an experimental study (n = 21) in which the search platform was used to find parts identified as potentially useful for inspiring solutions to a design challenge. Participants were asked to engage with three search modalities: searching by keywords, by curated 3D parts, and by user-assembled 3D parts in their workspace. When searching by curated or workspace parts, participants could additionally control how similar the retrieved results were to the input in appearance and function. The results of this study demonstrate that the modality used affects search behavior, including the frequency of searches, how participants engage with retrieved search results, and how broadly the search space is covered. Specific results link interactions with the interface to search strategies participants may have used during the task. Findings suggest that multi-modal search should enable intentional search for desired goals through direct search inputs (e.g., keywords) and through incremental adjustments to features of visually represented search inputs. Moreover, enabling the discovery of examples that were not explicitly searched for, whether through related information or more serendipitous encounters, may support exploratory search behavior.