
  • Estimation of Citrus Maturity with Fluorescence Spectroscopy Using Deep Learning

    Itakura, Kenta   Saito, Yoshito   Suzuki, Tetsuhito   Kondo, Naoshi   Hosoi, Fumiki  

    To produce high-quality citrus, the harvest time should be determined by considering fruit maturity. The Brix/acid ratio, the ratio of sugar content (soluble solids content) to acid content, is one of the most commonly used indicators of fruit maturity. To estimate the Brix/acid ratio, fluorescence spectroscopy, a rapid, sensitive, and inexpensive technique, was adopted. An extract was prepared from each citrus peel and its fluorescence was measured. The fluorescence spectrum was then analyzed using a convolutional neural network (CNN). Fluorescence spectroscopy yields an excitation-emission matrix (EEM), in which the fluorescence intensity is recorded for each pair of excitation and emission wavelengths. By treating the EEM as an image, the Brix/acid ratio of juice from the flesh was estimated by regression with a CNN (CNN regression). The resulting absolute error of the Brix/acid ratio estimate was 2.48, considerably better than the values obtained by other methods in previous studies. Hyperparameters such as layer depth, learning rate, and the number of filters were tuned with Bayesian optimization, which contributed to the high accuracy.
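
A minimal sketch of the CNN-regression idea described in this entry, assuming PyTorch: an excitation-emission matrix (EEM) is treated as a single-channel image and mapped to a scalar Brix/acid ratio. The EEM size (64 × 64 bins), the network layout, and the random training data below are placeholders rather than the paper's setup; there, the layer depth, learning rate, and number of filters were found by Bayesian optimization.

```python
import torch
import torch.nn as nn

class EEMRegressor(nn.Module):
    """Small CNN mapping a single-channel EEM image to one scalar (Brix/acid ratio)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.head(self.features(x))

# Placeholder data: 100 EEMs binned to 64 x 64 and their Brix/acid targets.
eems = torch.rand(100, 1, 64, 64)
targets = torch.rand(100, 1) * 20.0

model = EEMRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(10):          # short demo loop; real training runs much longer
    optimizer.zero_grad()
    loss = loss_fn(model(eems), targets)
    loss.backward()
    optimizer.step()
```
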
  • Three-Dimensional Monitoring of Plant Structural Parameters and Chlorophyll Distribution

    Itakura, Kenta   Kamakura, Itchoku   Hosoi, Fumiki  

    Image analysis is widely used for accurate and efficient plant monitoring. Plants have complex three-dimensional (3D) structures; hence, 3D image acquisition and analysis are useful for determining the status of plants. Here, 3D images of plants were reconstructed using a photogrammetric approach called structure from motion. Chlorophyll content is an important parameter that indicates the status of plants, and it was estimated from the 3D images of plants with color information. To observe changes in chlorophyll content and plant structure, a potted plant was kept for five days under water stress and its 3D images were taken once a day. The normalized Red value and the chlorophyll content were well correlated, with a high R² value (0.81). The absolute error of the chlorophyll content estimation in cross-validation was 4.0 × 10⁻² g/mm². At the same time, the structural parameters (the leaf inclination angle and the azimuth angle) were calculated, allowing changes in the plant's status to be monitored simultaneously in terms of both chlorophyll content and structure. By combining these parameters in plant image analysis, early detection of plant stressors, such as water stress, becomes possible.
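
A rough sketch of the colour-based chlorophyll estimate mentioned in this entry: compute the normalized Red value for each coloured point of the SfM-reconstructed model and map it to chlorophyll content through a linear calibration. The function names and the slope/intercept values are hypothetical; a real calibration is fitted against measured chlorophyll, as in the study (R² = 0.81).

```python
import numpy as np

def normalized_red(rgb):
    """Normalized Red value r = R / (R + G + B) for an (N, 3) array of point colours."""
    rgb = rgb.astype(float)
    return rgb[:, 0] / (rgb.sum(axis=1) + 1e-9)

def estimate_chlorophyll(rgb, slope=-0.10, intercept=0.08):
    """Linear calibration from normalized Red to chlorophyll content (g/mm^2).
    The slope and intercept here are placeholders, not fitted values."""
    return slope * normalized_red(rgb) + intercept

# Example: colours attached to points of a reconstructed 3D plant model.
point_colours = np.random.randint(0, 256, size=(1000, 3))
print(estimate_chlorophyll(point_colours).mean())
```
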
  • Estimation of Leaf Inclination Angle in Three-Dimensional Plant Images Obtained from Lidar

    Itakura, Kenta   Hosoi, Fumiki  

    The leaf inclination angle is a fundamental variable for describing the plant profile. In this study, the leaf inclination angle was estimated automatically from voxel-based three-dimensional (3D) images obtained from lidar (light detection and ranging), and the distribution of the leaf inclination angle within a tree was then calculated. The 3D images were first converted into voxel coordinates. Then, a plane was fitted to the voxels surrounding the point (voxel) of interest, and the inclination angle and azimuth angle were obtained from the plane's normal. The estimated leaf inclination angles were highly correlated with the actual values (R² = 0.95), and the absolute error of the leaf inclination angle estimation was 2.5 degrees. Furthermore, the leaf inclination angle could be estimated even when the distance between the lidar and the leaves was about 20 m, which suggests that the estimation is reliable for leaves in the upper part of a tree. The leaf inclination angle distribution within a tree was then calculated; differences in the distribution between different parts of the tree were observed, and a detailed analysis of tree structure was conducted. We found that this method enables accurate and efficient estimation of the leaf inclination angle distribution.
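
The plane-fitting step in this entry can be sketched as follows, assuming the local plane is fitted by principal component analysis: the surface normal is the eigenvector with the smallest eigenvalue of the covariance of the neighbouring voxel coordinates, and the inclination and azimuth angles follow from that normal. The neighbourhood size and fitting details are assumptions, not the paper's exact implementation.

```python
import numpy as np

def leaf_angles(neighbour_voxels):
    """Inclination and azimuth (degrees) of the local leaf surface.

    neighbour_voxels: (N, 3) voxel centre coordinates around the voxel of interest.
    A plane is fitted by PCA; its normal is the eigenvector of the smallest eigenvalue.
    """
    pts = neighbour_voxels - neighbour_voxels.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))   # eigenvalues in ascending order
    normal = eigvecs[:, 0]
    if normal[2] < 0:                                  # orient the normal upwards
        normal = -normal
    inclination = np.degrees(np.arccos(normal[2]))
    azimuth = np.degrees(np.arctan2(normal[1], normal[0])) % 360.0
    return inclination, azimuth

# Example: voxels lying roughly on a plane tilted about 30 degrees from horizontal.
xy = np.random.rand(50, 2)
z = np.tan(np.radians(30)) * xy[:, 0] + 0.01 * np.random.randn(50)
print(leaf_angles(np.column_stack([xy, z])))
```
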
  • 3D lidar imaging for detecting and understanding plant responses and canopy structure

    Omasa, Kenji   Hosoi, Fumiki   Konishi, Atsumi  

    Understanding and diagnosing plant responses to stress will benefit greatly from three-dimensional (3D) measurement and analysis of plant properties because plant responses are strongly related to their 3D structures. Light detection and ranging (lidar) has recently emerged as a powerful tool for direct 3D measurement of plant structure. Here the use of 3D lidar imaging to estimate plant properties such as canopy height, canopy structure, carbon stock, and species is demonstrated, and plant growth and shape responses are assessed by reviewing the development of lidar systems and their applications from the leaf level to canopy remote sensing. In addition, the recent creation of accurate 3D lidar images combined with natural colour, chlorophyll fluorescence, photochemical reflectance index, and leaf temperature images is demonstrated, thereby providing information on responses of pigments, photosynthesis, transpiration, stomatal opening, and shape to environmental stresses; these data can be integrated with 3D images of the plants using computer graphics techniques. Future lidar applications that provide more accurate dynamic estimation of various plant properties should improve our understanding of plant responses to stress and of interactions between plants and their environment. Moreover, combining 3D lidar with other passive and active imaging techniques will potentially improve the accuracy of airborne and satellite remote sensing, and make it possible to analyse 3D information on ecophysiological responses and levels of various substances in agricultural and ecological applications and in observations of the global biosphere.
  • Comparison between Rice Plant Traits and Color Indices Calculated from UAV Remote Sensing Images

    Shimojima, Kohei   Ogawa, Satoshi   Naito, Hiroki   Valencia, Milton Orlando   Shimizu, Yo   Hosoi, Fumiki   Uga, Yusaku   Ishitani, Manabu   Selvaraj, Michael Gomez   Omasa, Kenji  

    Remote sensing technology for monitoring plant traits has great potential to accelerate the breeding process. In this paper, we studied remote sensing with an unmanned aerial vehicle (UAV) system for phenotyping plant traits in rice. Images of the rice canopy were taken with an RGB camera from the UAV at three growing stages: Vegetative (VG), Flowering (FW), and Grain filling (GF). Typical color indices (r, g, b, INT, VIG, L*, a*, b*, H) were calculated by image processing. Single regression analysis was conducted between rice plant traits (leaf area index (LAI), grain yield, above-ground biomass, plant height, panicle length, grain filling rate, and tiller number) and the color indices. The index a* at FW and GF had close linear relationships with LAI (coefficient of determination R² > 0.70) and grain yield (R² > 0.50). Moreover, a* and g at FW and GF showed high R² with plant height and grain filling rate (R² > 0.50). With multiple regression analysis, the R² between grain yield and color indices exceeded 0.5 for about 40 of the models across the three growing stages; in particular, the models combining H and INT and combining H and L* at VG were closely related (R² > 0.70). Our findings show that the analysis of color images taken by UAV remote sensing is useful for assessing four rice traits (LAI, grain yield, plant height, and grain filling rate) at an early stage, and is especially useful for grain yield estimation.
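
A minimal sketch of the single-regression step in this entry: compute simple colour indices (normalized r, g, b and the intensity INT) from an RGB canopy image and regress one plant trait against one index, reporting R². The CIELAB and HSV indices (L*, a*, b*, H) used in the paper would need an extra colour-space conversion, and all data below are placeholders.

```python
import numpy as np

def colour_indices(image):
    """Colour indices from an (H, W, 3) RGB canopy image: plot-mean normalized
    r, g, b and mean intensity INT = (R + G + B) / 3."""
    rgb = image.reshape(-1, 3).astype(float)
    total = rgb.sum(axis=1) + 1e-9
    return {"r": (rgb[:, 0] / total).mean(),
            "g": (rgb[:, 1] / total).mean(),
            "b": (rgb[:, 2] / total).mean(),
            "INT": rgb.mean()}

print(colour_indices(np.random.randint(0, 256, size=(100, 100, 3))))

# Placeholder per-plot values: one colour index (e.g. g at flowering) and one
# measured trait (e.g. LAI). Single linear regression and its R^2.
index = np.array([0.30, 0.32, 0.35, 0.31, 0.34, 0.36])
trait = np.array([2.1, 2.6, 3.4, 2.3, 3.1, 3.6])
slope, intercept = np.polyfit(index, trait, 1)
pred = slope * index + intercept
r2 = 1 - np.sum((trait - pred) ** 2) / np.sum((trait - trait.mean()) ** 2)
print(f"trait = {slope:.2f} * index + {intercept:.2f}, R^2 = {r2:.2f}")
```
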
  • MODIS vegetation and water indices for drought assessment in semi-arid ecosystems of Iran

    Rahimzadeh Bajgiran, Parinaz   Shimizu, Yo   Hosoi, Fumiki   Omasa, Kenji

  • Estimating 3D Leaf and Stem Shape of Nursery Paprika Plants by a Novel Multi-Camera Photography System

    Zhang, Yu   Teng, Poching   Shimizu, Yo   Hosoi, Fumiki   Omasa, Kenji  

    For plant breeding and growth monitoring, accurate measurement of plant structure parameters is crucial. We therefore developed a high-efficiency Multi-Camera Photography (MCP) system combining Multi-View Stereovision (MVS) with the Structure from Motion (SfM) algorithm. In this paper, we measured six variables of nursery paprika plants and investigated the accuracy of 3D models reconstructed from photos taken with four lens types at four different positions. The results demonstrated that the error between estimated and measured values was small: the root-mean-square errors (RMSE) for leaf width/length and stem height/diameter were 1.65 mm (R² = 0.98) and 0.57 mm (R² = 0.99), respectively. Reconstruction of leaf and stem with a 28-mm lens at the first and third camera positions was the most accurate and produced the largest number of fine-scale 3D model surfaces. The results confirmed the practicability of the new method for reconstructing fine-scale plant models and accurately estimating plant parameters, and showed that the system captures high-resolution 3D images of nursery plants with high efficiency.
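
The accuracy figures quoted in this entry (RMSE and R² between measured and estimated leaf and stem dimensions) follow the standard definitions; a small helper with placeholder measurements is sketched below.

```python
import numpy as np

def rmse_and_r2(measured, estimated):
    """Root-mean-square error and coefficient of determination R^2."""
    measured = np.asarray(measured, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    rmse = np.sqrt(np.mean((estimated - measured) ** 2))
    r2 = 1 - np.sum((measured - estimated) ** 2) / np.sum((measured - measured.mean()) ** 2)
    return rmse, r2

# Placeholder leaf-width values (mm): field measurements vs. estimates from a 3D model.
print(rmse_and_r2([42.0, 55.1, 61.3, 48.7], [41.2, 56.0, 60.1, 49.5]))
```
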
  • High-power dye laser using steady-state amplification with chirped pulses

    Hosoi, Fumiki   Shimura, Masaru   Nabekawa, Yasuo   Kondo, Kiminori   Watanabe, Shuntaro  

  • Estimation of tree structure parameters from video frames with removal of blurred images using machine learning

    Itakura, Kenta   Hosoi, Fumiki

  • Detecting seasonal change of broad-leaved woody canopy leaf area density profile using 3D portable LIDAR imaging

    Hosoi, Fumiki   Omasa, Kenji  

    Seasonal change in the vertical leaf area density (LAD) profile of broad-leaved trees in a woody canopy (Zelkova serrata [Thunberg] Makino) was estimated using 3D portable scanning light detection and ranging (LIDAR) imaging. First, 3D point cloud data for the canopy were collected with a portable LIDAR in spring, summer, autumn and winter. For data collection, the canopy was evenly scanned by the LIDAR from three positions 10 m above the ground. Next, the vertical LAD profile in each season was computed from the LIDAR data using the voxel-based canopy profiling (VCP) method. For the computation, non-photosynthetic tissues were eliminated using the LIDAR data obtained during winter, and the influence of leaf inclination angle (LIA) on LAD estimation was corrected using LIA data measured by a high-resolution portable scanning LIDAR. The resultant profiles showed that LAD values tended to increase in the upper canopy from spring to summer and decrease in the middle and lower canopy from summer to autumn. Moreover, LIDAR-derived LIA distributions were compared among seasons: LIA showed an even distribution in spring but changed to a planophile distribution in summer, and in autumn the proportion of angles in the <30 degrees class decreased while that in the 30-40 degrees classes increased.
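
One step of the seasonal profiling in this entry, removing non-photosynthetic tissue with the leaf-off winter scan, can be sketched as a set difference between the voxels occupied in a leafy-season scan and those occupied in winter. The voxel size and the point clouds below are placeholders, and the full VCP computation adds further corrections (e.g., for leaf inclination) that are not shown.

```python
import numpy as np

def occupied_voxels(points, voxel_size=0.1):
    """Set of occupied voxel indices for an (N, 3) lidar point cloud (coordinates in metres)."""
    return set(map(tuple, np.floor(points / voxel_size).astype(int)))

def leaf_voxels(season_points, winter_points, voxel_size=0.1):
    """Voxels occupied in the leafy season but not in the leaf-off winter scan,
    i.e. with woody (non-photosynthetic) voxels removed."""
    return occupied_voxels(season_points, voxel_size) - occupied_voxels(winter_points, voxel_size)

# Placeholder point clouds; real data come from the portable scanning lidar.
summer = np.random.rand(5000, 3) * np.array([4.0, 4.0, 8.0])
winter = summer[np.random.rand(5000) < 0.2]   # pretend only woody returns remain in winter
print(len(leaf_voxels(summer, winter)))
```
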
  • 3-D Modeling of Tomato Canopies Using a High-Resolution Portable Scanning Lidar for Extracting Structural Information

    Hosoi, Fumiki   Nakabayashi, Kazushige   Omasa, Kenji  

    In the present study, an attempt was made to produce a precise 3D image of a tomato canopy using a portable high-resolution scanning lidar. The canopy was scanned by the lidar from three surrounding positions. Through the scanning, point cloud data of the canopy were obtained and co-registered. Points corresponding to leaves were then extracted and converted into polygon images, from which leaf areas were estimated with a mean absolute percent error of 4.6%. The vertical profile of leaf area density (LAD) and the leaf area index (LAI) could also be estimated by summing the leaf areas derived from the polygon images. Leaf inclination angle could also be estimated from the 3-D polygon images, and it was shown that the inclination angle took different values at different parts of a leaf.
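
The leaf-area step in this entry, converting extracted leaf points into polygons and summing their areas, can be sketched by summing triangle areas of a leaf mesh via the cross product. Mesh construction itself (co-registration, leaf extraction, triangulation) is not shown, and the example mesh is a placeholder.

```python
import numpy as np

def mesh_area(vertices, triangles):
    """Total surface area of a triangulated leaf polygon.

    vertices: (V, 3) coordinates; triangles: (T, 3) vertex indices.
    Each triangle's area is half the norm of the cross product of two of its edges.
    """
    a, b, c = (vertices[triangles[:, i]] for i in range(3))
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()

# Placeholder: a unit-square "leaf" split into two triangles (total area 1.0).
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
tris = np.array([[0, 1, 2], [0, 2, 3]])
print(mesh_area(verts, tris))   # summing such leaf areas per layer gives the LAD/LAI profile
```
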
  • 3-D voxel-based solid modeling of a broad-leaved tree for accurate volume estimation using portable scanning lidar

    Hosoi, Fumiki   Nakai, Yohei   Omasa, Kenji  

    We developed a method to produce a 3-D voxel-based solid model of a tree from portable scanning lidar data for accurate estimation of the volume of the woody material. First, we obtained lidar measurements with a high laser pulse density from several measurement positions around the target, a Japanese zelkova tree. Next, we converted the lidar-derived point-cloud data into voxels with a voxel size of 0.5 cm × 0.5 cm × 0.5 cm. Then, we used differences in the spatial distribution of voxels to separate the stem and large branches (diameter > 1 cm) from small branches (diameter ≤ 1 cm). We classified the voxels into sets corresponding to the stem and to each large branch and then interpolated voxels to fill out their surfaces and interiors. We then merged the stem and large branches with the small branches. The resultant solid model of the entire tree was composed of consecutive voxels that filled the outer surface and interior of the stem and large branches, plus a cloud of voxels equivalent to small branches scattered mainly in the upper part of the target. Using this model, we estimated the woody material volume by counting the number of voxels in each part and multiplying by the unit voxel volume (0.13 cm³). The percentage error of the volume of the stem and part of a large branch was 0.5%, and the estimation error for a certain part of the small branches was 34.0%.
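
The volume estimate described in this entry, counting occupied voxels and multiplying by the unit voxel volume, can be sketched as below using the paper's 0.5 cm voxel size. The interpolation that fills the stem and branch interiors is omitted, so this placeholder example counts only surface voxels.

```python
import numpy as np

def voxel_volume_cm3(points_cm, voxel_size_cm=0.5):
    """Volume estimate from an (N, 3) point cloud in centimetres: count the distinct
    occupied voxels and multiply by the unit voxel volume (0.5^3 = 0.125 cm^3)."""
    voxels = np.unique(np.floor(points_cm / voxel_size_cm).astype(int), axis=0)
    return len(voxels) * voxel_size_cm ** 3

# Placeholder stem: points on the surface of a cylinder 100 cm tall with 5 cm radius.
theta = np.random.rand(20000) * 2 * np.pi
z = np.random.rand(20000) * 100.0
stem = np.column_stack([5.0 * np.cos(theta), 5.0 * np.sin(theta), z])
print(voxel_volume_cm3(stem))
```
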
  • Voxel-based 3-D modeling of individual trees for estimating leaf area density using high-resolution portable scanning lidar

    Hosoi, Fumiki   Omasa, Kenji  

    A method for accurate estimation of leaf area density (LAD) and the cumulative leaf area index (LAI) profiles of small trees (Camellia sasanqua and Deutzia crenata) under different conditions was demonstrated, using precise voxel-based tree models produced by high-resolution portable scanning lidar. In this voxel-based canopy profiling (VCP) method, data for each horizontal layer of the canopy of each tree were collected from symmetrical azimuthal measurement points around the tree using optimally inclined laser beams. The data were then converted into a voxel-based three-dimensional model that reproduced the tree precisely, including within the canopy. This precise voxel model allowed the LAD and LAI of these trees, which have extremely dense and nonrandomly distributed foliage, to be computed by direct counting of the beam-contact frequency in each layer using a point-quadrat method. Corrections for leaf inclination and nonphotosynthetic tissues reduced the estimation error. A beam incident zenith angle near 57.5 degrees offered a good correction for leaf inclination without knowledge of the actual leaf inclination, and nonphotosynthetic tissues were removed by image-processing techniques. The best LAD estimations showed errors of 17% at the minimum horizontal layer thickness and 0.7% at the maximum thickness; the error of the best LAI estimation was also 0.7%.
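
A heavily simplified sketch of the voxel-based canopy profiling idea in this entry: within each horizontal layer, relate the frequency of beam-contact voxels to LAD. The conversion used here is a stand-in; the actual VCP method counts beam contacts with a point-quadrat approach, corrects for leaf inclination (e.g., beams near a 57.5-degree zenith angle), and removes non-photosynthetic tissue first.

```python
import numpy as np

def lad_profile(contact, passed, layer_thickness_m=0.5, correction=1.1):
    """Layer-by-layer LAD estimate from voxel counts (simplified).

    contact[i]: voxels in layer i where a laser beam was intercepted by foliage.
    passed[i]:  voxels in layer i that beams traversed without interception.
    LAD is taken proportional to the contact frequency per unit layer thickness;
    'correction' stands in for the leaf-inclination correction of the VCP method.
    """
    contact = np.asarray(contact, dtype=float)
    passed = np.asarray(passed, dtype=float)
    frequency = contact / np.maximum(contact + passed, 1.0)
    return correction * frequency / layer_thickness_m

# Placeholder counts for a six-layer canopy (bottom to top).
lad = lad_profile(contact=[5, 40, 120, 180, 90, 10],
                  passed=[400, 380, 300, 220, 260, 350])
print(lad, "LAI ~", (lad * 0.5).sum())   # LAI = LAD integrated over layer thickness
```
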