In this paper, we demonstrate photosensitive polyimide (PSPI) profile optimization to effectively reduce stress concentrations and enable PSPI as protection against package-induced stress. Through detailed package simulation, we demonstrate a ~45% reduction in stress as the sidewall angle (SWA) of the PSPI is increased from 45 to 80 degrees in Cu pillar package types. Through modulation of coating and development multi-step bake temperatures and times, as well as exposure dose and post-litho surface treatments, we demonstrate a method for reliably obtaining a PSPI sidewall angle >75 degrees. Additionally, we experimentally validate the simulation finding that the PSPI sidewall angle impacts chip-package interaction (CPI). Finally, we conclude with PSPI material and tool qualification requirements for future technology nodes based on current challenges.
Today, assistive robots are being introduced into human environments at an increasing rate. Human environments are highly cluttered and dynamic, making it difficult to foresee all necessary capabilities and pre-program all desirable future skills of a robot. One approach to increasing robot performance is semi-autonomous operation, which allows users to intervene and guide the robot through difficult tasks. To this end, robots need intuitive Human-Machine Interfaces (HMIs) that support fine motion control without overwhelming the operator. In this study, we evaluate the performance of several interfaces that balance autonomy and teleoperation of a mobile manipulator accomplishing several household tasks. Our proposed HMI framework includes teleoperation devices such as a tablet, as well as physical interfaces in the form of piezoresistive pressure-sensor arrays. Mobile manipulation experiments were performed with a sensorized KUKA youBot, an omnidirectional platform with a 5-degree-of-freedom (DOF) arm. The pick-and-place tasks involved navigation and manipulation of objects in household environments. Performance metrics included time to task completion and position accuracy.
Robots are starting to transition from the confines of the manufacturing floor to homes, schools, hospitals, and other highly dynamic environments. As a result, it is impossible to foresee all probable operational situations of robots and to preprogram robot behavior for those situations. Among human-robot interaction technologies, haptic communication is an intuitive physical interaction method that can help define operational behaviors for robots cooperating with humans. Multimodal robotic skin with distributed sensors can help robots increase the perception capabilities of their surrounding environments.

Electro-Hydro-Dynamic (EHD) printing is a flexible multi-modal sensor fabrication method because it can print a wide range of materials directly onto substrates with non-uniform topographies. In past work, we designed interdigitated comb electrodes as a sensing element and printed piezoresistive strain sensors using customized EHD-printable PEDOT:PSS-based inks. We formulated a PEDOT:PSS derivative ink by mixing PEDOT:PSS and DMSO. Bending-induced characterization tests of the prototyped sensors showed high sensitivity and sufficient stability.

In this paper, we describe SkinCells, robot skin sensor arrays integrated with electronic modules. 4x4 EHD-printed arrays of strain sensors were packaged onto Kapton sheets with a silicone encapsulant and interconnected to a custom electronic module consisting of a microcontroller, a Wheatstone bridge with an adjustable digital potentiometer, a multiplexer, and a serial communication unit. Thus, a SkinCell's electronics can be used for signal acquisition, conditioning, and networking between sensor modules. Several SkinCells were tested under controlled pressure, temperature, and humidity using dedicated testing apparatuses, and the results are reported in this paper.
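The abstract above mentions a Wheatstone bridge with an adjustable digital potentiometer for conditioning the piezoresistive strain-sensor signal. As a minimal sketch of how such a readout recovers sensor resistance from the bridge voltage, the quarter-bridge model below is an illustrative assumption, not the paper's actual circuit or firmware (the function names and reference resistor values are hypothetical):

```python
def bridge_output(r_sensor, r_ref, v_exc):
    """Quarter-bridge output voltage: the piezoresistive sensor sits in
    one arm; the other three arms are matched reference resistors."""
    return v_exc * (r_sensor / (r_sensor + r_ref) - 0.5)

def sensor_resistance(v_out, r_ref, v_exc):
    """Invert the bridge equation to recover the sensor resistance."""
    ratio = v_out / v_exc + 0.5  # equals r_sensor / (r_sensor + r_ref)
    return r_ref * ratio / (1.0 - ratio)

# A balanced bridge (sensor equal to reference) reads 0 V; a
# strain-induced resistance change unbalances it by a small voltage.
v = bridge_output(11_000.0, 10_000.0, 5.0)  # +10% resistance change
r = sensor_resistance(v, 10_000.0, 5.0)     # recovers ~11 kOhm
```

In such a topology, an adjustable digital potentiometer in a reference arm would serve to re-balance the bridge, keeping the small strain signal centered in the acquisition range.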
In recent years, advancements in computer vision, motion planning, and task-oriented algorithms, together with the availability and falling cost of sensors, have opened the doors to affordable autonomous robots tailored to assist individual humans. One of the main tasks for a personal robot is to provide intuitive and non-intrusive assistance when requested by the user. However, some base robotic platforms cannot perform autonomous tasks or be operated by general users because of their complex controls. Most users expect a robot to have an intuitive interface that allows them to directly control the platform as well as access some level of autonomous task execution. We aim to introduce this level of intuitive control and autonomous task execution into teleoperated robotics.

This paper proposes a simple sensor-based HMI framework in which a base teleoperated robotic platform is sensorized, allowing for basic levels of autonomous tasks as well as providing a foundation for the use of new intuitive interfaces. Multiple forms of Human-Machine Interfaces (HMIs) are presented, and a software architecture is proposed. As test cases for the framework, manipulation experiments were performed on a sensorized KUKA YouBot® platform, mobility experiments were performed on a LABO-3 Neptune platform, and a Nexus 10 tablet was used with multiple users in order to examine the robot's ability to adapt to its environment and to its user.