Abstract

This paper proposes a novel finger kinematic model for human hand configurations, applicable to the realization of naturalistic finger motion in robotic finger systems and artificial hands. The proposed model is derived from the geometry of a hand grasping a virtual cylindrical object. It describes the natural rotation configuration of the three-degree-of-freedom joints of a long finger with a single parameter, namely, the radius of the cylindrical object. Experimental validation shows that the model closely simulates naturalistic human finger movements. Using the proposed model, we discuss how to achieve the multifinger coordination that produces task-specific hand movements or postures for specific hand actions. Because the model defines the joint angle configuration of a long finger with a single parameter, the combination of the proposed model and the multifinger coordination concept can serve as an inclusive framework for the design and control of human-like hand systems. This paper is a first step toward future combined design–control strategies for under-actuated prosthetic and powered orthotic devices that achieve naturalistic motion based on both Cartesian-space trajectory tracking and joint angle coordination.

1 Introduction

The development of human finger/hand modeling has attracted much interest, and research in this area has been widespread, since it promises a variety of practical applications in medicine and rehabilitation, prosthetics, artificial hands, robotics, man–machine interface devices, human–computer interaction, and so on. The success of human hand modeling in these areas relies strongly on an understanding of the anatomical/functional aspects of the hand, the kinematic/dynamic properties of the hand, and the purposes for which the models are used. A variety of research on finger/hand modeling from various points of view has been carried out in human finger simulation [1], finger motion coordination [2,3], virtual hand modeling and simulation [4–7], articulated human hands [8,9], and so on.

One of the main functions of hand models is to simulate object grasping, which is realized by hand coordination: the combination of finger movements encompassing the interjoint coordination of each finger and the interfinger coordination of the hand. The implementation of multifinger coordination [10] in models can be a pivotal direction of modeling. Although the advantages and modeling directions of previous research are well understood, this work focuses on establishing a novel control-oriented finger kinematic model to be used in future combined design–control strategies. The work aims to provide a finger kinematic model capable of describing natural finger joint coordination with a single parameter. The proposed finger kinematic model is based on a hand shape in an open configuration with each finger in some degree of flexion. It is assumed that the form of the hand under consideration can be observed when it is power/in-hand grasping a cylindrical object. In this case, the object physically exists. However, the existence of a physical object is not required in precision grasping, which is characterized by grasping and holding an object with the fingertips. In this case, the fingers encompass a virtual cylinder. In both cases, the fingers are in some degree of flexion. The resulting model is applicable to control systems design for robotic and artificial hands, with the potential to replicate the multifinger coordination of the human hand. The degree of flexion of a finger is determined by the joint rotation angles of the finger, and these angles can be determined by the radius of the real/virtual cylindrical object assumed to be grasped. With the finger model, further discussion on multifinger coordination is provided.

Here, we would like to note that the main contribution of this work is a kinematic finger model that formalizes the naturalistic hand gesture during flexion or extension as closely as possible. Furthermore, the proposed model can determine the finger joint rotation configuration with a single control parameter, even though three angles are required to define the finger gesture. Single-parameter control would benefit the field of under-actuated artificial robotic hand design and control [11]. In addition, the ability to predict the most probable joint angle trajectories, in order to obtain a suitable wearable mechanism for limb rehabilitation for grasping, walking, or reaching motions, is very important. Existing methods require the derivation of specific parameters, such as segmental motions, forces, velocities, cycle periods, etc., or the use of initial Cartesian trajectories, which may be difficult to obtain for individuals with limited limb mobility. Therefore, there is a need for novel models that can predict joint trajectories from a subject's anthropometric parameters.

This paper is organized as follows. In Sec. 2, a geometry-based finger kinematic model is presented that determines the joint angle configuration of the index finger. It is shown that the proposed model is a one degree-of-freedom (DOF) model that calculates the rotation angles of all three joints with a single control parameter. Section 3 validates the proposed model experimentally. Section 4 describes the multifinger coordination function using the proposed model. Section 5 summarizes the work. Appendix A extends the proposed model, which applies directly to the middle finger, to the ring and little fingers. Appendix B investigates a kinematic finger model using an elliptical object.

2 Joint Angle Configuration Model

2.1 Basic Assumptions.

One possible shape of a human hand is an open configuration with each finger subconsciously held in some degree of flexion around its joints. This can be assumed to be a "naturalistic" shape of the human hand. Human long fingers (index, middle, ring, and pinky) have three joints each, i.e., the metacarpophalangeal (MCP), proximal interphalangeal (PIP), and distal interphalangeal (DIP) joints. Each finger has four DOF (flexion/extension at the DIP, flexion/extension at the PIP, and flexion/extension and abduction/adduction at the MCP joint) [12]. For simplicity, in the present work only the flexion/extension motion at the MCP joint is considered, which leads to a planar finger model with three degrees of freedom. In what follows, a mathematical model depicting the joint rotation coordination of a finger is proposed for naturalistic shape representation, followed by a multifinger coordination scheme for object manipulation, discussed in Sec. 4.

2.2 Joint Configuration Model for Index Finger.

In order to develop a mathematical model of the finger joint motion for the naturalistic shape of a hand, two types of the real/virtual object being grasped are considered: cylinder and cylindroid (a cylinder with an elliptical cross section). A joint configuration model using a cylinder is discussed in this section (for the index finger) and in Appendix  A (for the third to fifth fingers), and a model using a cylindroid (for the index finger) can be seen in Appendix  B.

The notion adopted to develop the joint configuration models is based on the assumption that the posture of a finger softly encompassing the surface of a virtual cylinder resembles that of the finger in the naturalistic hand shape defined above. With this in mind, the joint angle configuration of a finger is drawn first. The schematic of the index finger joint configuration is shown in Fig. 1. Based on the assumptions made above and the geometry of the index finger in Fig. 1, the joint rotation angles can be obtained using the trigonometric formulas characterizing the plane triangle shown in Fig. 2. For the semiperimeter s = (a + b + c)/2, the trigonometric angle formulas are given as follows [13]:
Fig. 1
Schematic of index finger and thumb joint configurations encompassing a virtual cylinder (planar motion is assumed). Note: R is the radius of a virtual cylinder object; JiI, i = 1, 2, 3, are the MCP, PIP, and DIP joints, respectively; Li is the length of phalanges of each finger; θi is the joint rotation angle at each joint; and tiI is the width from the surface to skeleton (joint).
Fig. 2
A plane triangle for trigonometric angle formulas
For triangle OCJ1IJ2I, the semiperimeter is
From the triangle, the following relations can be derived:
(1)
(2)
The rotation angle at the MCP joint can be calculated using Eq. (1) as follows:
(3)
For triangle OCJ2IJ3I, the semiperimeter is
and
(4)
(5)
By using Eqs. (2) and (4), the rotation angle of the PIP joint can be obtained as follows:
(6)
Finally, for triangle OCJ3ITI, the semiperimeter is
and
(7)
From the geometry of the triangles and Eqs. (5) and (7), the rotation angle at the DIP joint is obtained as follows:
(8)
It is worth noting that the following constraints on R should be taken into consideration to obtain the proper motion of the finger described in Eqs. (3), (6), and (8) from a geometric viewpoint. For the representation of realistic joint rotations, the minimum R, Rmin, should be greater than or equal to the largest of the values in Eq. (9):
(9)
In the present work, the maximum R, Rmax, applicable to the model is chosen by considering a geometric constraint between the MCP joints of the thumb and index finger and is determined to satisfy the following relation:
which gives

As observed in Eqs. (3), (6), and (8), the joint rotation configuration can be determined by the single parameter R, given fixed semi-thicknesses (i.e., t0I, t1I, t2I, and t3I) and phalanx lengths (L1, L2, and L3). It is worth noting that the set of Eqs. (3), (6), and (8) represents a 1-DOF joint rotation configuration model of the human fingers. Furthermore, these angles are not coupled. By prescribing different time profiles of R, both flexion and extension motions of a finger can be realized.
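To illustrate the 1-DOF property computationally, the following sketch adopts one plausible reading of the Fig. 1 geometry: joint Ji lies at distance R + t(i−1) from the cylinder center, adjacent joints are separated by the phalanx length Li, and the PIP and DIP angles follow triangle relations of the form γ + β + θ = π as stated in Appendix A. The MCP angle additionally depends on an orientation angle phi of the palm relative to the ray from the cylinder center, taken here as an input. All names and conventions below are illustrative assumptions, not the paper's exact formulas.

```python
import math

def r_min(t, L):
    """Smallest admissible radius (assumed reading of Eq. (9)): each triangle
    (O_C, J_i, J_i+1), with sides R+t[i], R+t[i+1], L[i], must satisfy the
    triangle inequality, i.e., R >= (L[i] - t[i] - t[i+1]) / 2."""
    return max((L[i] - t[i] - t[i + 1]) / 2.0 for i in range(3))

def joint_angles(R, t, L, phi=0.0):
    """MCP, PIP, DIP flexion angles (degrees) for a planar finger wrapping a
    cylinder of radius R.  t = (t0, t1, t2, t3): semi-thicknesses at the MCP,
    PIP, DIP joints and fingertip; L = (L1, L2, L3): phalanx lengths;
    phi: assumed palm orientation angle (radians)."""
    d = [R + ti for ti in t]                      # center-to-joint distances

    def interior(p, q, opp):
        # interior angle between triangle sides p and q, opposite side `opp`
        return math.acos((p * p + q * q - opp * opp) / (2.0 * p * q))

    beta = [interior(d[i], L[i], d[i + 1]) for i in range(3)]   # at proximal joint
    gamma = [interior(d[i + 1], L[i], d[i]) for i in range(3)]  # at distal joint
    theta1 = math.pi / 2 - (beta[0] - phi)        # MCP (cf. Appendix A relation)
    theta2 = math.pi - gamma[0] - beta[1]         # PIP: gamma1 + beta2 + theta2 = pi
    theta3 = math.pi - gamma[1] - beta[2]         # DIP: gamma2 + beta3 + theta3 = pi
    return tuple(math.degrees(v) for v in (theta1, theta2, theta3))
```

With the index-finger dimensions used later in the paper (t = (1.5, 1.31, 0.94, 0.49) cm, L = (4.54, 2.29, 1.76) cm), shrinking R from 6 cm to 3 cm increases all three angles, reproducing the expected flexion behavior.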

Multiple works, mentioned above, control independent angular values to generate the grasping motion. Others use optimization methods to create such values through recursive calculation, which requires significant computational effort. Some of the proposed solutions cannot coordinate the movement in time. Therefore, exploring new modeling methods is needed. The proposed model reduces the computational burden by using a single variable that governs the three angular values of each finger's phalanges to coordinate each finger harmoniously in generating the grasping motion. The merit of the proposed model is its simplicity when applied to robotic hand design, assessment, and control systems, such as articulated prosthetic devices, visual animation, and virtual reality, among others.

In order to investigate the changes in joint angles as R changes over time, several R profiles have been considered. Figures 3–7 show the flexion motion of an index finger using the proposed model with different profiles of R. The following values of the semi-thicknesses and phalanx lengths of the index finger under consideration are used for simulation: t0I = 1.5, t1I = 1.31, t2I = 0.94, and t3I = 0.49 cm; L1 = 4.54, L2 = 2.29, and L3 = 1.76 cm. In Fig. 3, R = R(t) is chosen to vary linearly with respect to time. As expected, the smaller the radius R, the larger the joint angles. However, the calculated profile (behavior) of each joint angle does not appear similar to that of a human finger [2,3,10,14,15]. Following this, different profiles of R are examined to see which of them is best suited for simulating human finger motion. The candidate profiles under investigation are (1) parabola (Fig. 4), (2) hyperbolic tangent (Fig. 5), (3) cosine (Fig. 6), and (4) spline (Fig. 7) functions. For the hyperbolic tangent function, following the formula adopted from Ref. [3], the coefficients of the function are set as c1 = (Rmax + Rmin)/2, c2 = (Rmax − Rmin)/2, c3 = 0.8, and c4 = 0.4 for the calculation example. The spline curve was shaped with the points (0, Rmax), (0.5, 0.9 Rmax), (1, 0.8(Rmax + Rmin)/2), (1.8, 1.2 Rmin), and (2, Rmin). From the calculation results, it can be noted that the configuration parameter R with the hyperbolic tangent profile appears best suited for describing the natural flexion motion of a finger, as reported in Refs. [3,10]. This can be easily justified from the plots of stepwise finger motion (shown in Figs. 3–7 with a time span of 0.1 s). Note that in Fig. 5, a broad span between consecutive movements is observed in the middle of the flexion excursion, which means that the finger moves fast in that range. This notion is discussed in more detail in Sec. 4.
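For reference, the hyperbolic-tangent profile can be sketched as below. The exact functional form from Ref. [3] is not quoted here, so a common sigmoid shape consistent with the stated coefficients c1 = (Rmax + Rmin)/2 and c2 = (Rmax − Rmin)/2 is assumed; the reference's actual formula may differ in detail.

```python
import math

def r_tanh(tm, r_max, r_min, c3=0.8, c4=0.4):
    """Assumed hyperbolic-tangent radius profile, decreasing from ~Rmax to
    ~Rmin over time: R(t) = c1 - c2 * tanh((t - c3) / c4).
    c3 shifts the midpoint of the transition; c4 sets its steepness."""
    c1 = (r_max + r_min) / 2.0
    c2 = (r_max - r_min) / 2.0
    return c1 - c2 * math.tanh((tm - c3) / c4)
```

A slow start, a fast mid-excursion, and a slow finish fall out of the tanh shape, matching the broad frame spacing seen in the middle of Fig. 5.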
Joint configuration models for the third to fifth fingers and the thumb are given in Appendix  A.

Fig. 3
Calculated joint rotation angles for different values of the radius R. The radius R is chosen to vary linearly with respect to time. (a) Profiles of R (dotted line) and derived joint angles over time and (b) the simulated stepwise motion of finger joints (displayed with a speed of 0.1 s per frame).
Fig. 4
Calculated joint rotation angles for changes in R of a parabola function. (a) Profiles of R (dotted line) and derived joint angles over time and (b) the simulated stepwise motion of finger joints (displayed with a speed of 0.1 s per frame).
Fig. 5
Calculated joint rotation angles for changes in R of a hyperbolic tangent function. (a) Profiles of R (dotted line) and derived joint angles over time and (b) the simulated stepwise motion of finger joints (displayed with a speed of 0.1 s per frame).
Fig. 6
Calculated joint rotation angles for changes in R of a cosine function. (a) Profiles of R (dotted line) and derived joint angles over time and (b) the simulated stepwise motion of finger joints (displayed with a speed of 0.1 s per frame).
Fig. 7
Calculated joint rotation angles for changes in R of a spline function. (a) Profiles of R (dotted line) and derived joint angles over time and (b) the simulated stepwise motion of finger joints (displayed with a speed of 0.1 s per frame).

3 Validation of the Finger Kinematic Model

For the evaluation of the proposed finger model, a set of motion capture experiments with two subjects was performed. Optical markers were attached to the index finger of each subject, and the movement of the finger was recorded, as shown in Fig. 8(a). During motion capture, each subject performed several consecutive flexion and extension movements, and the planar coordinates of the joints and the fingertip were acquired. From these coordinates, the fingertip trajectory and joint angle data of each subject were calculated.

Fig. 8
Fingertip trajectory comparison of the model and the experimental data. (a) Photograph of the motion capture validation process, (b) experimental versus model results for subject 1 and (c) experimental versus model results for subject 2.
A set of joint angles was calculated for the parameter R varying linearly from 15 to 240 mm using Eqs. (3), (6), and (8) (i.e., the joint angle configuration model). In the calculation, the anthropometric dimensions of each subject's finger were used. From the joint angles for each R, the trajectory (x3, y3) of the fingertip was calculated using
In order to see how much the proposed model can replicate a naturalistic motion of the human finger, the Cartesian fingertip trajectories from the model and the experiment for each subject were compared. The results are shown in Figs. 8(b) and 8(c). From the figures, one can see that the proposed model can replicate a naturalistic finger motion with a high degree of agreement during the flexion and extension movements.
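Under a standard planar three-link convention (MCP joint at the origin, flexion angles accumulating along the chain; the paper's exact sign and offset conventions may differ), the fingertip trajectory above can be computed as:

```python
import math

def fingertip(theta, L):
    """Planar forward kinematics of a 3-link finger.
    theta = (th1, th2, th3): joint angles in radians, accumulated distally;
    L = (L1, L2, L3): phalanx lengths.  MCP joint at the origin."""
    x = y = acc = 0.0
    for th, seg in zip(theta, L):
        acc += th                    # orientation of this phalanx
        x += seg * math.cos(acc)
        y += seg * math.sin(acc)
    return x, y
```

Sweeping R through Eqs. (3), (6), and (8) and passing the resulting angles through this map traces the fingertip curve compared against the motion capture data in Figs. 8(b) and 8(c).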

A comparison of the joint angle configurations in joint space between the experiment and the model is shown in Table 1. The joint angles of the model for each point were calculated via Eqs. (3), (6), and (8) with the value of R corresponding to the coordinates of the fingertip point under consideration. The experiment and the model agree closely in the joint angle configurations at similar fingertip points. Comparing both the trajectory and the joint angle configuration between the model and the experiments shows that the proposed model closely simulates the naturalistic motion of the human finger in Cartesian as well as joint space.

Table 1

Comparison of joint angle configurations for the proposed model validation (for subject 1)

Point      x3 (mm)   y3 (mm)   R (mm)    θ1 (deg)   θ2 (deg)   θ3 (deg)
A  Exp.     58.43     60.49      —        42.47      62.97      50.19
A  Model    56.77     60.90     21.43     41.59      66.86      43.21
B  Exp.     50.64     58.94      —        23.33      39.69      24.41
B  Model    50.66     58.94     48.52     23.49      38.99      25.33
C  Exp.     76.24     36.04      —        11.20      22.17      11.35
C  Model    76.23     36.03    129.34     11.47      20.55      13.81

(The parameter R applies to the model rows only.)

3.1 Geometry-Based Joint Configuration Model for Cylindroid Grasping.

The joint configuration model of an index finger grasping a real/virtual cylindroid object is given in Appendix B. The proposed model can be treated as a one-DOF model; however, it is not well suited for direct use in control systems design, since the joint angle configurations are obtained by numerically solving systems of nonlinear equations. Refer to Appendix B for more details.

4 Interjoint and Interfinger Coordination

In many hand motions, several fingers work together to perform specific tasks. The notion of “working together” can be explained in terms of the multifinger-coordination function of the hand. The hand coordination is a combination of finger movements encompassing (1) interjoint coordination in each finger and (2) interfinger coordination in the hand.

From a survey of research on multifinger coordination [2,8,10], it is observed that interjoint coordination is related to the sequence of movement of the phalanges of a finger. For a voluntary extension movement (opening), a proximal-to-distal sequence is evident in the long fingers and a distal-to-proximal sequence in the thumb. For the flexion movement (closing), the sequences are reversed in the long fingers and the thumb, respectively. Furthermore, motion analysis of interfinger coordination gives the following results: the MCP joints of the fingers move together, with peak velocity at approximately 50% of the movement excursion. Similar trends are observed in the interphalangeal (IP) joints, with peak velocity at about 57% of the whole movement (see Ref. [10] for details).

In this work, the joint configuration model proposed in Sec. 2 is used to implement the concept of multifinger coordination. The notion of coordinated motion endows the model with the capability to simulate the natural characteristics of the human hand. The joint configuration model has one degree of freedom and determines each of the joint rotation angles (at the MCP, PIP, and DIP joints) of a finger simultaneously with only the single parameter R, representing the most elemental finger movements, i.e., flexion and extension, in a simple way. Since the model places no constraints on the angles between joints and each joint angle is determined independently by the single parameter, it can easily be used to implement interjoint coordination.

Figure 9 shows a schematic explaining the concept of the proposed coordinated motion of a hand. The solid-line blocks in each of the long fingers represent the proposed joint configuration kinematic model for the rotation angles of the MCP, PIP, and DIP joints, respectively. As described in Eqs. (3), (6), and (8), each joint angle is a function of the joint configuration parameter Rik, which is the output of the interjoint motion coordination block assigned to the finger (represented as a broken-line block in the figure). The interjoint motion coordination block sets the profile of Rik and provides each joint model with its value. The profile of Rik can be established to (1) meet the sequence of movement of the phalanges during flexion or extension (discussed in Ref. [10], for example) or (2) produce any specific finger shape.

Fig. 9
Schematic for the coordinated motion of hand
For the sequence of movement in the phalanges of a finger, the temporal aspect of interjoint coordination is established by imposing time delays on the R profile, thereby providing more natural behavior of the human finger. As an example, Fig. 10 shows a set of time-delayed R profiles that produce a proper sequence of movement during the flexion of a finger. Let us assume that RI(t), t ≥ 0, is given for the flexion movement of the index finger. To establish the interjoint coordination discussed above, the R profiles for the DIP, PIP, and MCP joints can be chosen as follows:
where 0 < τ1 < τ2. For illustrative purposes, τ1 = 0.15 and τ2 = 0.3 s are set (see the upper inset in the figure). For details on the time durations in the movement series, refer to the work of Nakamura et al. [15]. With these profiles, the DIP joint (θ3), connected to the distal phalanx, starts its rotation first. For the time τ1 (0.15 s in the example) after initiation, the PIP and MCP joints remain at a standstill. After 0.15 s, the PIP joint (θ2) begins to move without any movement of the MCP joint. Finally, 0.15 s after the PIP joint starts, the MCP joint (θ1) begins to move, resulting in a properly ordered flexion movement.
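The time-delayed profiles above amount to holding each joint's radius at its initial value until that joint's delay elapses; a minimal sketch follows (the profile names and the linear example profile are illustrative):

```python
def delayed(r_of_t, tau):
    """Time-shift a radius profile R(t): the joint sees R(max(t - tau, 0)),
    so it stays at the initial radius (no motion) until its delay tau elapses."""
    return lambda tm: r_of_t(max(tm - tau, 0.0))

# Flexion sequence: DIP leads, PIP follows after tau1, MCP after tau2.
r_index = lambda tm: 10.0 - 4.0 * tm          # any flexion profile R_I(t)
r_dip = delayed(r_index, 0.0)
r_pip = delayed(r_index, 0.15)
r_mcp = delayed(r_index, 0.30)
```

Before its delay elapses, each delayed profile returns R(0), so the corresponding joint holds its initial angle, reproducing the distal-to-proximal flexion ordering described above.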
Fig. 10
Example of R-profiles for interjoint coordination

The coordination at the multifinger level is realized in the interfinger motion coordination block. It aims at determining human hand configurations, i.e., the hand shapes needed to perform specific tasks such as grasping or object manipulation [16–18]. As mentioned earlier, the realization of interjoint coordination using the proposed kinematic model is directly related to the temporal aspects of a single finger movement, which concern the order of rotation initiation of each joint.

On the other hand, interfinger coordination is related to the spatial aspects of multifinger movements, which concern the arrangement of virtual cylinders with different radii Ri, i = I, M, R, P, following the outputs of grasping algorithms [19,20], control systems [21–23], or a library/database of hand gestures ([24], for instance) to produce a specific shape, as shown in Fig. 11.

Fig. 11
Example of a hand shape made by the thumb and long fingers with different degrees of flexion, which can be realized by grasping virtual cylinders, each with a different radius

5 Conclusions

A novel finger kinematic model for the human hand configuration was derived based on the geometry of a long finger assumed to be softly enclosing the surface of a (virtual) cylindrical object. The proposed model (1) provides the joint rotation configuration (represented by three degrees of rotational motion) of the long finger with a single variable/control parameter, namely, the radius of the (virtual) cylindrical object, (2) calculates the values of the three joint rotation angles independently, and (3) describes the interjoint coordination function, which can be combined with the interfinger coordination function to realize multifinger coordination. The proposed index finger kinematic model can easily be applied to the other fingers (shown in Appendix A). A kinematic finger model encompassing an elliptical virtual object was also considered in Appendix B; it is shown that this model requires solving nonlinear simultaneous equations numerically. Experimental validation shows that the model closely simulates naturalistic human finger movements, with applications in mimicking human hand shapes as well as robotic hand design and control, among others.

Funding Data

  • NSF (CAREER Award Id # 1751770; Funder ID: 10.13039/100000001).

  • 2018 Research-Year Grant of Jeonju University, South Korea.

Appendix A: Joint Configuration Model Assuming the Fingers Encompass a Virtual Cylindrical Object

Joint Configuration Model for the Third to Fifth Fingers

For the third to fifth digits, it is assumed that the MCP joint of each finger is offset from the origin J1I, the center of rotation of the MCP joint of the index finger. Let the offsets be sxij and syij, where i = I and j = M, R, and P. Note that the subscripts I, M, R, and P denote the index, middle, ring, and pinky (little) fingers, respectively.

From Fig. 12,
where sxIM and syIM are the coordinates of the MCP joint of the middle finger with respect to the origin J1I.
Fig. 12
Schematic of the middle finger joint configuration. The notion can be adapted to the ring and little fingers, respectively, with different values of (sxIM, syIM). Dashed lines represent the thumb and index fingers.
For triangle OCJ1MJ2M, the semiperimeter is
and the following relations hold:
where t0M and t1M are the lengths of O1MJ1M¯ and O2MJ2M¯, respectively; β1M = ∠OCJ1MJ2M and γ1M = ∠J1MJ2MOC.
Let νM = β1M − ϕM; then νM + θ1M = π/2. From this relation, the joint angle of the MCP at the middle finger can be obtained as
(A1)
For triangle OCJ2MJ3M, the semiperimeter is
and
where β2M = ∠OCJ2MJ3M and γ2M = ∠J2MJ3MOC.
From the geometry, γ1M + β2M + θ2M = π. Then, the joint angle of PIP at the middle finger can be obtained as follows:
(A2)
Similarly, for triangle OCJ3MTM, the semiperimeter is
and
where β3M = ∠OCJ3MTM.
From the geometry, γ2M + β3M + θ3M = π. Then, the joint angle of DIP at the middle finger can be obtained as follows:
(A3)
Equations (A1), (A2), and (A3) represent the joint rotation configuration of the middle finger of the human hand. For proper joint motion, the constraints on R should be considered, as mentioned earlier. This notion can be used to derive joint rotation configuration models for the ring and pinky fingers. Furthermore, it is worth noting that this is a one degree-of-freedom model that simulates joint rotation coordination and can be manipulated with the single variable R.

Joint Configuration Model for the Thumb

With the notation in Fig. 1, a planar thumb model for the joint rotation configuration can be derived in a manner similar to Sec. 2.2. For simplicity, it is assumed that β0T and L0T, related to the anthropometric data of the human hand, are known.

For triangle OCJ1IJ1T, the semiperimeter is
and the following relations hold:
For triangle OCJ1TJ2T, the semiperimeter is
and the following relations hold:
From the angles obtained above and β1T + γ0T = θ1T + (π/2 − β0T), the joint rotation at the MCP joint of the thumb is
(A4)
For triangle OCJ2TTT, its semiperimeter is
and the following relation holds:
Again, from the angles obtained above and γ1T + β3T + θ3T = π/2, the joint rotation at the IP joint of the thumb is
(A5)
Similarly, one should consider constraints on R for the proper motion of the thumb described in Eqs. (A4) and (A5).

Appendix B: Joint Configuration Model Assuming the Fingers Are Encompassing a Virtual Cylindroid Object

In this appendix, a second joint rotation configuration model is proposed by considering another virtual object encompassed by the fingers. A naturalistic shape of the fingers of the human hand is again assumed to be the one observed when a human hand is grasping a (virtual) cylindroid object with an elliptical cross section, as shown in Fig. 13. By keeping the numerical eccentricity of the ellipse constant (defined as e = c/a, where c = √(a² − b²)), one can obtain a joint rotation configuration kinematic model of the index finger that can be manipulated by a single control variable, either a or b.
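The constraint structure of Eqs. (B1)–(B4) — a contact point A1 on the ellipse, a joint point A1′ offset outward by t1 along the line through the ellipse center (the slope-equality condition), and the phalanx length fixing the joint position — can be sketched numerically as follows. The placement of the MCP joint A0′ on the major axis outside the ellipse and the function names are illustrative assumptions; the paper's actual boundary conditions may differ.

```python
import math

def mcp_contact(a, b, t1, L1, A0=(-4.0, 0.0)):
    """Find the ellipse parameter u of the contact point A1 = (a cos u, b sin u)
    such that the joint A1' = A1 * (1 + t1/|A1|) -- offset by t1 along the ray
    from the ellipse center, which satisfies the slope-equality constraint --
    lies at phalanx distance L1 from the assumed MCP position A0'.
    Solved by bisection on u."""
    def gap(u):
        x, y = a * math.cos(u), b * math.sin(u)
        s = 1.0 + t1 / math.hypot(x, y)          # outward offset factor
        return math.hypot(x * s - A0[0], y * s - A0[1]) - L1

    lo, hi = math.pi / 2, math.pi                # gap(lo) > 0 > gap(hi)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if gap(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)
```

With the Fig. 15 dimensions (a = 2.5, b = 2, t1 ≈ 1.261, L1 ≈ 4.090 cm) and A0′ = (−(a + t0), 0) = (−4, 0), the solver returns the contact parameter; the remaining joints would be found the same way, chaining from A1′, and the joint angle then follows from the recovered coordinates as in Eq. (B5).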

Fig. 13
Schematic of the index finger encompassing a virtual cylindroid object

The joint angle configuration of the index finger can be obtained by formulating and solving geometric constraints which need to be met to describe the posture of the fingers when they encompass the virtual elliptic object under consideration. These constraints are described with the coordinates of the points of interest. Let us begin with the geometric aspect of the proximal phalanx of the index finger shown in Fig. 13.

The point A1I = (x1, y1) lies on the ellipse, and the following holds:
(B1)
For the proximal phalanx A0I′A1I′¯
(B2)
The distance between points A1I and A1I′ is
(B3)
where t1I = |A1IA1I′¯|.
The slope-equality condition between OEA1I¯ and A1IA1I′¯ should be met
(B4)
The coordinates of the points A1I and A1I′ can be found by numerically solving the system of nonlinear equations (B1)–(B4) (four equations in four unknowns: x1, y1, x1′, and y1′). Finally, the joint angle of the MCP in the index finger is obtained as
(B5)

The joint rotation angle of the PIP joint in the index finger is obtained by the procedure described above.

Constraint CP1 at the contact point A′2I = (x′2, y′2)
x′2²/a² + y′2²/b² = 1
(B6)
Constraint CP2 on the middle phalanx of length L2I, with A2I = (x2, y2),
(x2 − x1)² + (y2 − y1)² = L2I²
(B7)
Constraint CP3 at points A2I and A′2I
(x2 − x′2)² + (y2 − y′2)² = t2I²
(B8)
where t2I = |A2IA′2I|.
Constraint CP4 between OEA′2I and A2IA′2I
y′2/x′2 = (y2 − y′2)/(x2 − x′2)
(B9)
The coordinates of point A2I (and A′2I) can be found by solving the system of nonlinear equations (B6)–(B9). The joint angle of the PIP is then obtained as
(B10)
(B10)

Using the procedure above, the joint rotation configuration of the DIP joint in the index finger is obtained.

Constraint CD1 at the contact point A′3I = (x′3, y′3)
x′3²/a² + y′3²/b² = 1
(B11)
Constraint CD2 on the distal phalanx of length L3I, with A3I = (x3, y3),
(x3 − x2)² + (y3 − y2)² = L3I²
(B12)
Constraint CD3 at points A3I and A′3I
(x3 − x′3)² + (y3 − y′3)² = t3I²
(B13)
where t3I = |A3IA′3I|.
Constraint CD4 between OEA′3I and A3IA′3I
y′3/x′3 = (y3 − y′3)/(x3 − x′3)
(B14)
The coordinates of point A3I (and A′3I) can be found by solving this system of nonlinear equations. The joint angle of the DIP is then obtained as
(B15)
(B15)

Figure 14 shows how the joint angles of the MCP, PIP, and DIP joints are calculated in consecutive order. Two examples of the calculation with different sets of a and b are shown in Fig. 15. Note that the model must be solved numerically to obtain the joint angle configuration of a finger.
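The consecutive order of Fig. 14 can be sketched by reducing each per-joint system to a single unknown: parametrizing the contact point on the ellipse satisfies the on-ellipse constraint by construction, placing the joint at radial distance tiI outside the contact point satisfies the offset and collinearity constraints, and the remaining length constraint becomes a scalar root-finding problem. This reduction, the assumed base position A0I, and the bracketing intervals are choices of this sketch (the paper solves the four-equation systems directly); parameter values are those of Fig. 15, case (1).

```python
# Illustrative 1-D reduction of the consecutive MCP -> PIP -> DIP solve.
import numpy as np
from scipy.optimize import brentq

a, b = 2.5, 2.0                       # semi-axes (cm), Fig. 15 case (1)
L = [4.0895, 1.9317, 1.8322]          # phalanx lengths L1I, L2I, L3I (cm)
t = [1.2609, 0.9321, 0.5532]          # offsets t1I, t2I, t3I (cm)
base = np.array([0.0, -(b + 1.5)])    # assumed A0I, offset t0I = 1.5 below the ellipse

def joint_point(u, ti):
    """Joint located ti outside the ellipse, collinear with the center
    and the contact point p(u) = (a sin u, -b cos u)."""
    p = np.array([a * np.sin(u), -b * np.cos(u)])
    return p * (1.0 + ti / np.linalg.norm(p))

joints, u_prev = [base], 0.0
for Li, ti in zip(L, t):              # consecutive order: MCP -> PIP -> DIP
    g = lambda u: np.linalg.norm(joint_point(u, ti) - joints[-1]) - Li
    u_prev = brentq(g, u_prev + 0.05, u_prev + 1.2)   # assumed bracket
    joints.append(joint_point(u_prev, ti))

# Relative flexion between successive phalanges (illustrative convention,
# not necessarily the paper's angle definition)
vecs = np.diff(np.array(joints), axis=0)
angles = [np.arctan2(v1[0] * v2[1] - v1[1] * v2[0], v1 @ v2)
          for v1, v2 in zip(vecs, vecs[1:])]
```

Each stage reuses the previous joint's endpoint as its base, mirroring the data flow of Fig. 14.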

Fig. 14
Joint angle calculation procedure
Fig. 15
Example of joint rotation configuration calculation with a cylindroid of (1) a = 2.5 and b = 2 cm and (2) a = 4 and b = 3.2 cm. Note: t0I = 1.5, t1I = 1.2609, t2I = 0.9321, t3I = 0.5532 cm, L1I = 4.0895, L2I = 1.9317, and L3I = 1.8322 cm.

Note that when a = b, the object has a circular cross section. In this case, the joint rotation configuration can be obtained either by numerically solving the constraints discussed here with a = b = R or from the formulas in Sec. 2.2.

References

1. Barbagli, F., Frisoli, A., Salisbury, K., and Bergamasco, M., 2004, "Simulating Human Fingers: A Soft Finger Proxy Model and Algorithm," HAPTICS'04 Proceedings, 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, Mar. 27–28, pp. 9–17.
2. Kim, B.-H., 2014, "Analysis of Coordinated Motions of Humanoid Robot Fingers Using Interphalangeal Joint Coordination," Int. J. Adv. Rob. Syst., 11(69), pp. 1–8.
3. Braido, P., and Zhang, X., 2004, "Quantitative Analysis of Finger Motion Coordination in Hand Manipulative and Gestic Acts," Hum. Mov. Sci., 22(6), pp. 661–678. 10.1016/j.humov.2003.10.001
4. PenaPitarch, E., Falguera, N. T., and Yang, J. J., 2012, "Virtual Human Hand: Model and Kinematics," Comput. Methods Biomech. Biomed. Eng., 17(5), pp. 568–579. 10.1080/10255842.2012.702864
5. Miyata, N., Kouchi, M., and Mochimaru, M., 2007, "Generation and Validation of 3D Links for Representative Hand Models," SAE 2007 Digital Human Modeling for Design and Engineering Conference, Seattle, WA, June 12–14, Paper No. 2007-01-2512.
6. Miyata, N., Kouchi, M., Mochimaru, M., Kawachi, L., and Kurihara, T., 2005, "Hand Link Modeling and Motion Generation From Motion Capture Data Based on 3D Joint Kinematics," SAE 2005 Digital Human Modeling for Design and Engineering Symposium, Iowa City, IA, June 14–16, Paper No. 2005-01-2687.
7. Savescu, A., Cheze, L., Wang, X., Beurier, G., and Verriest, J., 2004, "A 25 Degrees of Freedom Hand Geometrical Model for Better Hand Attitude Simulation," SAE 2004 Digital Human Modeling for Design and Engineering Symposium, Rochester, MI, June 15–17, Paper No. 2004-01-2196.
8. McDonald, J., Toro, J., Alkoby, K., Berthiaume, A., Carter, R., Chomwong, P., Christopher, J., Davidson, M. J., Furst, J., Konie, B., Lancaster, G., Roychoudhuri, L., Sedgwick, E., Tomuro, N., and Wolfe, R., 2001, "An Improved Articulated Model of the Human Hand," Vis. Comput., 17(8), pp. 158–166. 10.1007/s003710100104
9. Nolker, C., and Ritter, H., 2002, "Visual Recognition of Continuous Hand Postures," IEEE Trans. Neural Netw., 13(4), pp. 983–994. 10.1109/TNN.2002.1021898
10. Carpinella, I., Jonsdottir, J., and Ferrarin, M., 2011, "Multi-Finger Coordination in Healthy Subjects and Stroke Patients: A Mathematical Modelling Approach," J. NeuroEng. Rehabil., 8(19), pp. 1–19.
11. Moyer, T., Faulrig, E. L., and Santos-Munne, J. J., 2013, "One Motor Finger Mechanism," U.S. Patent No. 8470051B2.
12. Lippert, L. S., 2006, Clinical Kinesiology and Anatomy, 4th ed., F. A. Davis Company, Philadelphia, PA.
13. Polyanin, A. D., and Manzhirov, A. V., 2007, Handbook of Mathematics for Engineers and Scientists, Chapman & Hall/CRC, Boca Raton, FL.
14. Santello, M., Flanders, M., and Soechting, J. F., 2002, "Patterns of Hand Motion During Grasping and the Influence of Sensory Guidance," J. Neurosci., 22(4), pp. 1426–1435. 10.1523/JNEUROSCI.22-04-01426.2002
15. Nakamura, M., Miyawaki, C., Matsushita, N., Yagi, R., and Handa, Y., 1998, "Analysis of Voluntary Finger Movements During Hand Tasks by a Motion Analyzer," J. Electromyogr. Kinesiol., 8(5), pp. 295–303. 10.1016/S1050-6411(97)00040-0
16. Feix, T., Romero, J., Schmiedmayer, H., Dollar, A. M., and Kragic, D., 2016, "The GRASP Taxonomy of Human Grasp Types," IEEE Trans. Hum. Mach. Syst., 46(1), pp. 66–77. 10.1109/THMS.2015.2470657
17. Cutkosky, M., 1989, "On Grasp Choice, Grasp Models and the Design of Hands for Manufacturing Tasks," IEEE Trans. Rob. Autom., 5(3), pp. 269–279. 10.1109/70.34763
18. Yu, Y., Li, Y., and Tsujio, S., 2001, "Analysis of Finger Position Regions on Grasped Object With Multifingered Hand," Proceedings of the 2001 IEEE International Symposium on Assembly and Task Planning (ISATP2001), Fukuoka, Japan, May 28–30, pp. 178–183.
19. Bicchi, A., 1995, "On the Closure Properties of Robotic Grasping," Int. J. Rob. Res., 14(4), pp. 319–334. 10.1177/027836499501400402
20. Bowers, D., and Lumia, R., 2003, "Manipulation of Unmodeled Objects Using Intelligent Grasping Schemes," IEEE Trans. Fuzzy Syst., 11(3), pp. 320–330. 10.1109/TFUZZ.2003.812689
21. Nagai, K., and Yoshikawa, T., 1993, "Dynamic Manipulation/Grasping Control of Multifingered Robot Hands," Proceedings of IEEE International Conference on Robotics and Automation, Atlanta, GA, May 2–6, pp. 1027–1033.
22. Nagai, K., and Yoshikawa, T., 1995, "Grasping and Manipulation by Arm/Multifingered-Hand Mechanisms," Proceedings of IEEE International Conference on Robotics and Automation, Nagoya, Japan, May 21–27, pp. 1040–1047.
23. Nagashima, T., Seki, H., and Tanako, M., 1997, "Analysis and Simulation of Grasping/Manipulation by Multi-Finger Surface," Mech. Mach. Theory, 32(2), pp. 175–191. 10.1016/S0094-114X(96)00054-7
24. Molina-Vilaplana, J., and Lopez-Coronado, J., 2006, "A Neural Network Model for Coordination of Hand Gesture During Reach to Grasp," Neural Netw., 19(1), pp. 12–30.