The first development phase of a space mission consists of the Space System Concept Studies, in which system concepts are broadly defined as a set of feasible System Conceptual Solutions that accomplish the mission needs. Nowadays, this phase involves the practices of Systems Engineering (SE) and Concurrent Engineering (CE), which respectively (i) organize the system investigation and documentation methodology, and (ii) speed up the process through the parallelization of discipline studies and successive convergence sessions (CE Sessions). CE Sessions for Space System Concept Studies are highly interactive activities that require: (i) specialists of a given discipline (thermal, operations, electrical, etc.) to describe their System Element solution models to the team, showing their parts and required parameters, and (ii) facilities to handle the CE activities, streamlining the work toward the System Concept Solutions. In both document-centric and model-centric approaches, model collaboration occurs by projecting the models in a sequential order, limited by the number of projectors available to show the disciplines' models. This virtualization of information undermined physical collaboration through artefacts in favour of virtual-only metaphor collaborations: for instance, the gesture of bringing two physical pawns together to obtain a new representation was replaced by dragging and dropping tree branches in a Graphical User Interface (GUI). This thesis proposes and demonstrates the viability of using Tangible User Interfaces (TUI), built with physical electronic artefacts and Spatial Augmented Reality, to reintroduce tangible collaboration into CE Sessions. A tangible interaction vocabulary was defined so that real artefacts can be used to control CE data. In a pragmatic aspect for the space engineering sector, this thesis brings cognitive aid tools back to the design workspace.
Keywords: Tangible User Interface. Concurrent Engineering. Model Based System Engineering. Space Systems Concept Design. Collaborative Environments.
Doctorate Thesis of the Graduate Course in Space Engineering and Technology/Space Systems Engineering and Management, advised by Drs. Ana Maria Ambrosio and Claudio Kirner, approved on February 28, 2018.
“With the passage of time, the psychology of people stays the same, but the tools and objects in the world change. Cultures change. Technologies change. The principles of design still hold, but the way they get applied needs to be modified to account for new activities, new technologies, new methods of communication and interaction”. Don Norman in “The Design of Everyday Things”, 2013
Space systems development requires understanding of several modelling and simulation resources to accomplish space mission goals. The largest and most complex simulator developed in the space mission context is the operational simulator, which supports all operational activities before, during, and after launch. This work proposes a virtual-reality-based user interface for a satellite operational simulator, providing an interactive 3D visualization with multiple views, 3D models, conic views, and contextualized-panel techniques to navigate through simulated data. The user interface employs third-generation interaction concepts instead of the WIMP (Windows, Icons, Menus, and Pointer) metaphor. A simple operational simulator, with simplified behaviour and orbit propagation, was developed to illustrate and evaluate the user interface. The proposed user interface was evaluated through a usability questionnaire with experienced INPE users. The results show that the interaction techniques applied were well accepted by different types of users. The testing, expansion, and reuse of this work are also discussed.
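The abstract mentions that the demonstration simulator used a "simplified behaviour and orbit propagation". The thesis does not specify the propagation model, but a minimal sketch of what such a simplified propagator might look like, assuming an idealized circular Keplerian orbit (a common simplification for illustrative simulators), is shown below. The function names and the circular-orbit assumption are illustrative, not taken from the work itself:

```python
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2


def mean_motion(a_km: float) -> float:
    """Mean motion (rad/s) of an orbit with semi-major axis a_km."""
    return math.sqrt(MU_EARTH / a_km**3)


def propagate_circular(a_km: float, t_s: float, phase0: float = 0.0):
    """Position (km) on an idealized circular equatorial orbit after t_s seconds."""
    theta = phase0 + mean_motion(a_km) * t_s
    return (a_km * math.cos(theta), a_km * math.sin(theta), 0.0)


# Example: a low-Earth orbit with a 7000 km semi-major axis
x, y, z = propagate_circular(7000.0, 600.0)  # position 10 minutes after epoch
```

A real operational simulator would replace this with higher-fidelity dynamics (perturbations, attitude, subsystems), but a closed-form model of this kind is enough to drive an interactive 3D visualization.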
Master's Dissertation of the Graduate Course in Space Engineering and Technology/Space Systems Engineering and Management, advised by Drs. Walter Abrahão dos Santos and Ana Maria Ambrosio, approved on February 17, 2014.
“The product is no longer the basis of value. The experience is”. Venkat Ramaswamy
basAR is an Augmented Reality environment for creating and using applications whose behaviour can be programmed to enhance the augmented experience. Application examples include puzzles, assembly guides, education, and games. basAR applications are based on the concept of action points: the centres of reactive zones in the Structure Layer, positioned relative to the Infrastructure Layer object, the base Marker. The Actuator Layer object, the Actuator Marker, also carries an action point. A virtual collision between a structure action point and an actuator action point, or between a structure action point and another structure action point carried by the actuator action point, triggers a reaction. In basAR, the collision reactions, or collision feedbacks, can be dynamically configured in authoring mode.
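The action-point mechanism described above can be sketched as a simple proximity test between reactive zones, with a feedback table standing in for the authoring-mode configuration. This is an illustrative sketch, not basAR's actual implementation; the class, function, and table names are hypothetical:

```python
import math
from dataclasses import dataclass


@dataclass
class ActionPoint:
    """Centre of a reactive zone, positioned relative to its layer's marker."""
    name: str
    x: float
    y: float
    z: float
    radius: float = 0.05  # reactive-zone radius, in arbitrary scene units


def collides(a: ActionPoint, b: ActionPoint) -> bool:
    """True when the two reactive zones overlap (a virtual collision)."""
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z)) <= a.radius + b.radius


# Hypothetical feedback table: in basAR this mapping would be
# configured dynamically in authoring mode.
reactions = {("door", "key"): "open_door"}


def react(structure: ActionPoint, actuator: ActionPoint):
    """Return the configured reaction if the actuator collides with the structure."""
    if collides(structure, actuator):
        return reactions.get((structure.name, actuator.name))
    return None
```

The same test covers the second case in the text (a structure point carried by the actuator point) by treating the carried point as the moving one in the proximity check.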