Because 3D image capture and 3D image printing handle color communication differently, manufacturing processes of this type often produce significant color differences between printed and original objects. Conventional color image reproduction techniques based on CIE colorimetry have been in use for over 80 years and perform very well when converting color images from one digital medium to another under various viewing conditions. However, these traditional techniques are not straightforward to apply to 3D printing.
With this protocol, 3D color objects are split into monochrome 3D objects and 2D color images. A traditional color reproduction technique can then be applied to convert the 2D color image from camera RGB to the corresponding printer RGB via human color appearance attributes. To achieve accurate color reproduction with 3D imaging devices, dedicated color profiles must be developed that connect the color system of each device to the human visual system.
Color Profile Development for 3D Imaging Devices
To reproduce true color from a 3D camera on a 3D printer, both a camera color profile and a printer color profile must be developed, linking camera RGB and printer RGB to the human eye response (CIE XYZ tristimulus values). Because skin colors and tones vary considerably, the training colors should cover the full spectrum of skin colors found across human populations; templates such as the digital Macbeth ColorCheckerDC chart (X-Rite Inc., Grand Rapids, MI, USA) are suitable for this. Using this method, a 2D color chart can be converted into a 3D model of dimensions 200 (l) × 150 (w) × 3 (h) with the desired colors and printed on a 3D printer such as the Z Corp Z510 color printer.
These serve as the training colors; the CIE XYZ tristimulus values for each training color in the printed chart can then be obtained with a spectrophotometer such as the Konica Minolta CM-2600d (Konica Minolta Inc., Tokyo, Japan). During measurement, standard viewing conditions should be applied: d/8 measurement geometry (diffuse illumination, 8° viewing), specular component included, and the aperture set to a consistent 3 mm diameter. The illuminant should also be consistent and ideally conform to an industry standard, such as CIE standard illuminant D65 with the CIE 1931 standard observer, to simulate skin color under daylight conditions.
Based on the printer RGB values and the measured CIE XYZ tristimulus values of the training colors, a printer color profile can be developed using a third-order polynomial regression model. A camera color profile can be generated from the same chart: color images of the printed chart are captured with the 3dMD camera system, and camera RGB values are extracted for each training color. From these camera RGB and CIE XYZ tristimulus values, a camera color profile can be developed using a quadratic polynomial regression. With both profiles in place, each pixel of the image is first converted from camera RGB to CIE XYZ tristimulus values and then converted to printer RGB.
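As an illustration of the regression step, the sketch below fits a polynomial mapping from device RGB to CIE XYZ, which is the camera-profile direction; a printer profile would be fitted the same way in the reverse direction (XYZ to printer RGB). The function names and the use of NumPy least squares are illustrative assumptions, not the original implementation.

```python
import numpy as np

def poly_features(rgb, degree=3):
    """Expand normalized RGB triplets into all monomials up to `degree`."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    terms = [np.ones_like(r)]  # constant term
    for total in range(1, degree + 1):
        for i in range(total + 1):
            for j in range(total - i + 1):
                k = total - i - j
                terms.append(r**i * g**j * b**k)
    return np.column_stack(terms)

def fit_profile(src, dst, degree=3):
    """Least-squares fit of a polynomial map, e.g. camera RGB -> CIE XYZ."""
    A = poly_features(src, degree)
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs

def apply_profile(coeffs, src, degree=3):
    """Apply a fitted profile to new device values."""
    return poly_features(src, degree) @ coeffs
```

With 20 polynomial terms at third order, at least 20 well-spread training colors are needed for a stable fit, which is one reason a chart covering the full range of skin tones is used.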
Color Reproduction Evaluation
To evaluate color reproduction for the human face, a color test chart must first be designed. This can be achieved using 14 preset human skin colors: four Caucasian, two Chinese, two Asian, four African, and two Caribbean. Next, a 3D color chart with defined, consistent dimensions must be produced on the color printer to be used (Z Corp Z510). After the same finishing treatment that would be applied to a final prosthesis, this chart is referred to as the original chart.
Color reproduction can then be evaluated using two reproduction charts produced with two different 3D color image reproduction systems. For the first chart, a color image of the original chart is captured with the 3dMD camera system and processed with only minor corrections to the 3D geometry before being sent to the Z Corp Z510 printer; this printed chart is referred to as the first reproduction chart. The second chart is produced following the proposed 3D color image reproduction process and is referred to as the second reproduction chart.
To evaluate color reproduction performance, the CIE XYZ tristimulus values of each color patch in each chart should be measured with the spectrophotometer (Konica Minolta CM-2600d). The color difference between the original chart and each of the two reproduction charts should be calculated for each of the 14 test colors under CIE illuminant D65 using a CIELAB color difference formula. The average, maximum, minimum, and standard deviation of the color differences can be recorded and tabulated. Applied correctly, this method shows that a significant improvement in color reproduction can be achieved with the proposed 3D color image reproduction system. For successful application in the production of facial soft tissue prostheses, an acceptable color difference for 3D printed objects is approximately 3–4 ΔE*ab.
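For the evaluation step, the ΔE*ab difference can be computed directly from pairs of measured XYZ tristimulus values. The sketch below uses the standard CIE 1976 CIELAB formulae with the D65 reference white for the 2° observer; the function names are illustrative.

```python
import numpy as np

# D65 reference white, CIE 1931 2-degree observer, Y normalized to 100
WHITE_D65 = np.array([95.047, 100.0, 108.883])

def xyz_to_lab(xyz, white=WHITE_D65):
    """CIE XYZ (0-100 scale) -> CIELAB, per the CIE 1976 definition."""
    t = np.asarray(xyz, dtype=float) / white
    # Cube root above the (6/29)^3 threshold, linear segment below it
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e_ab(xyz1, xyz2):
    """CIELAB color difference Delta E*ab (CIE 1976) between two stimuli."""
    return np.linalg.norm(xyz_to_lab(xyz1) - xyz_to_lab(xyz2), axis=-1)
```

Computing ΔE*ab for all 14 patches and taking the mean, maximum, minimum, and standard deviation reproduces the tabulated statistics described above.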
Color Texture Mapping
Most 3D photogrammetry systems give the illusion of texture by wrapping a 2D image over a 3D surface. This wrapped texture does not actually model fine wrinkles, pores, or blemishes on the skin surface, so these details cannot be reproduced in the rapid prototyping process. Viewing the 3D polygon mesh side by side with the 2D bitmap image captured by the 3D camera system (3dMD System) makes this clear: the 3D mesh does not contain the fine details, including pores and wrinkles, that are clearly visible in the 2D bitmap. Each polygon in the 3D mesh is linked to a specific region of the 2D bitmap, and the color from that region is placed onto the otherwise monochrome polygon.
Various techniques have been developed to increase realism and improve the characterization of the patient's skin. One such method incorporates surface details such as pores, wrinkles, and fine lines into the 3D model. Height field mapping (also known as bump mapping) in the CAD design process can be used to convert a texture reference image into a geometric pattern. Height field mapping converts a flat 2D image into 3D geometric relief: based on a grayscale texture reference map in which white is high and black is low, representative texture can be mapped onto the appropriate areas of the face model. These techniques can draw not only on patient-specific data from adjacent facial anatomy/topography, but also on pre-treatment 2D photographs of the same area.
The second technique allows realistic skin textures to be added manually to the prosthetic model surface in cases where tissue from other parts of the face is unsuitable, inconsistent, or unrepresentative of the area to be replaced. Based on the gray-level intensity (from white to black), the software controls the depth of imperfections on the skin surface: the surface topography varies with the gray-level intensity, placing individual pores and wrinkles. The resulting mesh can be 3D printed on a flat surface to create a texture similar in appearance to real skin.
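The gray-level-to-depth mapping described above can be sketched as follows, assuming white pixels lie on the nominal surface and black pixels are recessed by a chosen maximum depth. The 0.3 mm depth and the function names are illustrative assumptions, not values from the text.

```python
import numpy as np

def height_field(gray, max_depth=0.3):
    """Map a grayscale texture (0 = black, 255 = white) to surface heights.

    White pixels stay at the nominal surface (height 0); black pixels are
    recessed by up to `max_depth` (assumed here to be in mm), so dark pores
    and wrinkles become depressions in the printed surface.
    """
    gray = np.asarray(gray, dtype=float) / 255.0
    return (gray - 1.0) * max_depth  # 0 at white, -max_depth at black

def displace_flat_grid(gray, max_depth=0.3):
    """Build (x, y, z) vertices for a flat grid displaced by the height field."""
    h, w = gray.shape
    x, y = np.meshgrid(np.arange(w), np.arange(h))
    z = height_field(gray, max_depth)
    return np.stack([x, y, z], axis=-1)
```

The same displacement can be applied along surface normals of a curved mesh rather than the z-axis, which is how texture is carried onto complex 3D shapes.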
Given the flexibility of this type of software, texture can be mapped not only to flat surfaces but also to complex 3D shapes. However, the detail achieved in the final prosthesis depends not only on the resolution of the 3D data obtained, but also on the resolution of the 3D printer and on the properties of the powder and binder used in the process. The use of a coarse powder in the printing process will reduce the texture detail derived from height field mapping, even if such detail can be mapped in the CAD process; conversely, a finer powder allows very fine details to be reproduced on the printed prosthesis.
Writer: Ozlem Guvenc Agaoglu