Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2016033764
A display device that outputs information in response to factors external to the device, a control method of the display device, and a program are provided. A head-mounted display device 100 includes an image display unit 20 which is worn on a user's body and used, transmits an outside scene, and displays an image so as to be visible together with the outside scene. The head-mounted display device 100 also includes a right headphone 32 and a left headphone 34 that output sound. Furthermore, it includes a target detection unit 171 that detects an object in the direction of the user's line of sight, a distance detection unit 173 that detects the distance between the detected object and the user, and an information output control unit 179 that controls the sound output of the right headphone 32 and the left headphone 34 according to the detected distance. [Selected figure] Figure 2
Display device, control method of display device, and program
[0001]
The present invention relates to a display device, a control method of the display device, and a
program.
[0002]
Conventionally, wearable display devices having a function of displaying text are known (see, for example, Patent Document 1).
The device described in Patent Document 1 changes display attributes such as font size and
character color for a part of characters or words of text data so that the user can grasp the
content of displayed text data.
[0003]
JP 2014-56217 A
[0004]
For example, the device described in Patent Document 1 changes the display attribute in accordance with the content of the information to be displayed. Thus, although a method for controlling output in accordance with the information to be displayed has been proposed, there has been no method for controlling the output of information in accordance with the situation outside the device or outside the user of the device. The present invention has been made in view of the above-described circumstances, and it is an object of the present invention to provide a display device that outputs information in response to factors external to the device, a control method of the display device, and a program.
[0005]
In order to achieve the above object, the display device of the present invention is a display device worn on a user's body and used, comprising: a display unit that transmits an outside scene and displays an image so as to be visible together with the outside scene; a sound output unit for outputting sound; an object detection unit for detecting an object in the direction of the user's line of sight; a distance detection unit for detecting the distance between the object detected by the object detection unit and the user; and a sound control unit configured to control the sound output of the sound output unit in accordance with the distance detected by the distance detection unit. According to the present invention, sound can be output in response to factors external to the display device.
[0006]
In the display device according to the present invention, the display device further includes a data acquisition unit that acquires sound data related to the object, and when the distance detected by the distance detection unit is longer than a preset distance, the sound control unit may cause the sound output unit to output the sound acquired by the data acquisition unit. According to the present invention, it is possible to control the output of sound in accordance with the distance between the display device and the object.
[0007]
In the display device according to the present invention, the display device further includes an imaging unit configured to capture an image in the direction of the user's line of sight, and the object detection unit detects the object that the user visually recognizes through the display unit based on the image captured by the imaging unit. According to the present invention, an object in the direction of the user's line of sight can be detected based on a captured image.
[0008]
Further, according to the present invention, in the display device, the object detection unit detects the state of the object that the user visually recognizes through the display unit based on the image captured by the imaging unit, and the sound control unit causes the sound output unit to output the sound acquired by the data acquisition unit in accordance with the state of the object detected by the object detection unit. According to the present invention, sound can be output corresponding to the state of an object detected from a captured image.
[0009]
Furthermore, in the display device according to the present invention, the sound control unit causes the sound output unit to output a human voice or a voice imitating a human voice. According to the present invention, the user can hear a human voice or a voice imitating a human voice.
[0010]
Further, according to the present invention, in the display device described above, the sound control unit converts the sound data acquired by the data acquisition unit into sound data including components of a preset frequency range, and causes the sound output unit to output sound based on the converted sound data. According to the present invention, since the frequency components included in the sound data can be converted, processing is possible such as increasing the components in a frequency band that the user hears easily, so that, for example, a human voice can be heard more clearly over noise.
[0011]
Furthermore, the present invention is characterized in that, in the display device, the sound output unit is a headphone worn by the user. According to the present invention, there is an advantage that the sound output to the user is unlikely to leak to the surroundings.
[0012]
Furthermore, the present invention is characterized in that the display device includes a sound
detection unit, and the data acquisition unit acquires data of the sound collected by the sound
detection unit. According to the present invention, environmental sound can be acquired and the
acquired sound can be output.
[0013]
Further, according to the present invention, in the display device described above, the sound control unit causes the sound detection unit to detect the sound emitted by the object detected by the object detection unit, and causes the sound output unit to output the detected sound or a sound having the same content as the detected sound. According to the present invention, a detected sound or a sound having the same content as the detected sound can be output.
[0014]
In the display device according to the present invention, the sound control unit changes the frequency components of the sound data acquired by the data acquisition unit based on the distance detected by the distance detection unit. According to the present invention, the frequency components of the output sound can be changed based on the distance from the object.
[0015]
In addition, in order to achieve the above object, the present invention is a control method of a display device that is worn on a user's body and used, the display device including a display unit that transmits an outside scene and displays an image so as to be visible together with the outside scene, and a sound output unit for outputting sound, the method comprising: detecting an object in the direction of the user's line of sight; detecting the distance between the detected object and the user; and controlling the sound output of the sound output unit according to the detected distance. According to the present invention, sound can be output in response to factors external to the display device.
[0016]
In addition, in order to achieve the above object, the present invention is a computer-executable program for controlling a display device that is worn on a user's body and used, the display device including a display unit that transmits an outside scene and displays an image so as to be visible together with the outside scene, and a sound output unit for outputting sound. The program causes the computer to function as: an object detection unit for detecting an object in the direction of the user's line of sight; a distance detection unit for detecting the distance between the object detected by the object detection unit and the user; and a sound control unit that controls the sound output of the sound output unit according to the distance detected by the distance detection unit. According to the present invention, sound can be output in response to factors external to the display device.
[0017]
FIG. 1 is an explanatory view showing the external configuration of a head-mounted display device. FIG. 2 is a block diagram showing the functional configuration of the head-mounted display device. FIG. 3 is a flowchart showing the operation of the head-mounted display device. FIG. 4 is a flowchart showing the audio output process in detail. FIG. 5 is an explanatory view showing a typical application example of the head-mounted display device, in which (A) is a schematic diagram showing the configuration of a theater where the head-mounted display device is used, and (B) and (C) show examples of the user's field of view.
[0018]
FIG. 1 is an explanatory view showing the appearance of a head-mounted display device 100. The head-mounted display device 100 is a display device mounted on the head,
and is also called a head mounted display (HMD). The head-mounted display device 100
according to the present embodiment is an optical transmission head-mounted display device
that allows the user to view a virtual image and also to directly view an outside scene. In the
present specification, a virtual image visually recognized by the user by the head-mounted
display device 100 is also referred to as a “display image” for convenience. Further, emitting
an image light generated based on image data is also referred to as “displaying an image”.
[0019]
The head-mounted display device 100 includes an image display unit 20 that allows the user to
visually recognize a virtual image in a state of being worn on the head of the user, and a control
device 10 that controls the image display unit 20. The control device 10 also functions as a
controller for the user to operate the head-mounted display device 100. The image display unit
20 is also simply referred to as a "display unit".
[0020]
The image display unit 20 is a mounting body to be worn on the head of the user and, in the present embodiment, has an eyeglass shape. The image display unit 20 includes a right holding unit 21, a right display driving unit 22, a left holding unit 23, a left display driving unit 24, a right optical image display unit 26, a left optical image display unit 28, a camera 61 (imaging unit), and a microphone 63. The right optical image display unit 26 and the left optical image display unit 28 are disposed in front of the user's right and left eyes, respectively, when the user wears the image display unit 20. One end of the right optical image display unit 26 and one end of the left optical image display unit 28 are connected to each other at a position corresponding to the area between the user's eyebrows when the user wears the image display unit 20.
[0021]
The right holding unit 21 is a member provided to extend from the end ER, which is the other end of the right optical image display unit 26, to a position corresponding to the user's temporal region when the user wears the image display unit 20. Similarly, the left holding unit 23 is a member provided to extend from the end EL, which is the other end of the left optical image display unit 28, to a position corresponding to the user's temporal region when the user wears the image display unit 20. The right holding unit 21 and the left holding unit 23 hold the image display unit 20 on the user's head, like the temples of eyeglasses.
[0022]
The right display drive unit 22 and the left display drive unit 24 are disposed on the side facing
the head of the user when the user wears the image display unit 20. Hereinafter, the right
holding unit 21 and the left holding unit 23 are collectively referred to simply as the “holding
unit”, and the right display driving unit 22 and the left display driving unit 24 are collectively
referred to simply as the “display driving unit”. The right optical image display unit 26 and the
left optical image display unit 28 are collectively referred to simply as “optical image display
unit”.
[0023]
The display driving units 22 and 24 include liquid crystal displays 241 and 242 (hereinafter also referred to as "LCDs 241 and 242"), projection optical systems 251 and 252, and the like (see FIG. 2). Details of the configuration of the display drive units 22 and 24 will be described later. The optical image display units 26 and 28 as optical members include light guide plates 261 and 262 (see FIG. 2) and a light control plate 20A. The light guide plates 261 and 262 are made of a light-transmissive resin or the like, and guide the image light output from the display drive units 22 and 24 to the eyes of the user. The light control plate 20A is a thin plate-like optical element, and is disposed so as to cover the front side of the image display unit 20, which is the side opposite to the user's eyes. As the light control plate 20A, various plates can be used, such as one having almost zero light transmittance, one that is nearly transparent, one that attenuates the amount of light it transmits, and one that attenuates or reflects light of a specific wavelength. By appropriately selecting the optical characteristics (light transmittance and the like) of the light control plate 20A, the amount of external light incident on the right optical image display unit 26 and the left optical image display unit 28 from the outside can be adjusted, and thus the ease with which the virtual image is visually recognized can be adjusted. In the present embodiment, a case will be described in which a light control plate 20A is used that has at least a light transmittance that allows a user wearing the head-mounted display device 100 to visually recognize the outside scenery. The light control plate 20A also protects the right light guide plate 261 and the left light guide plate 262, suppressing damage to them and adhesion of dirt and the like. The light control plate 20A may be attachable to and detachable from the right optical image display unit 26 and the left optical image display unit 28, may be replaceable with another light control plate, or may be omitted.
[0024]
The camera 61 is disposed at the end ER, which is the other end of the right optical image display unit 26. The camera 61 captures the outside scene in the direction opposite to the side of the user's eyes, and acquires an outside scene image. The camera 61 of the present embodiment shown in FIG. 1 is a monocular camera, but may be a stereo camera. The shooting direction of the camera 61, that is, its angle of view, covers the front direction of the head-mounted display device 100, in other words, at least a part of the user's viewing direction in the state where the head-mounted display device 100 is worn. Although the width of the angle of view of the camera 61 can be set as appropriate, it is preferable that the imaging range of the camera 61 include the outside world (outside scene) viewed by the user through the right optical image display unit 26 and the left optical image display unit 28. Furthermore, it is more preferable that the imaging range of the camera 61 be set so as to be able to capture the entire field of view of the user through the light control plate 20A.
[0025]
The image display unit 20 further includes a connection unit 40 for connecting the image display
unit 20 to the control device 10. The connection unit 40 includes a main body cord 48 connected
to the control device 10, a right cord 42, a left cord 44, and a connecting member 46. The right cord 42 and the left cord 44 are cords into which the main body cord 48 branches in two.
The right cord 42 is inserted into the housing of the right holding unit 21 from the tip end
portion AP in the extension direction of the right holding unit 21, and is connected to the right
display driving unit 22. Similarly, the left cord 44 is inserted into the housing of the left holding
unit 23 from the tip end portion AP in the extension direction of the left holding unit 23 and is
connected to the left display driving unit 24.
[0026]
The connecting member 46 is provided at the branch point of the main body cord 48 and the right cord 42 and the left cord 44, and has a jack for connecting the headphone plug 30. From the headphone plug 30, a right headphone 32 and a left headphone 34 as a sound output unit extend. A microphone 63 (sound detection unit) is provided in the vicinity of the headphone plug 30. The cord from the headphone plug 30 to the microphone 63 is a single cord, which branches at the microphone 63 and connects to each of the right headphone 32 and the left headphone 34. The right headphone 32 and the left headphone 34 shown in FIG. 1 are inner ear headphones fitted to the pinna of the user. This is merely an example, and the right headphone 32 and the left headphone 34 may have any configuration that outputs sound. The sound output by the right headphone 32 and the left headphone 34 under the control of the sound processing unit 170 and the information output control unit 179 described later is, for example, a human voice or a voice imitating a human voice, but is not limited thereto. The sounds output from the right headphone 32 and the left headphone 34 may be any sounds in the audible range that the user can recognize by hearing, including animal sounds, mechanical sounds, and all other sounds. In the present embodiment, an example of outputting a human voice or a voice imitating a human voice as described above will be described. A configuration that outputs sound so that the user can hear it while it is difficult for people around the user to hear it is preferable, because only the user of the head-mounted display device 100 hears the sound and the surrounding people are not affected. For example, the right headphone 32 and the left headphone 34 may be, in addition to the inner ear type, canal type headphones, ear-hook type headphones, closed type headphones covering the user's pinna, or open-air type headphones. Furthermore, instead of or in addition to the right headphone 32 and the left headphone 34, a bone conduction speaker may be provided in the head-mounted display device 100. In this case, the bone conduction speaker is preferably provided on either or both of the right holding unit 21 and the left holding unit 23, which easily come into contact with the user's head.
[0027]
The specific specification of the microphone 63 is arbitrary; it may be a directional microphone or an omnidirectional microphone. Examples of directional microphones include single-directional (cardioid), narrow-directional (supercardioid), sharp-directional (hypercardioid), and super-directional (ultracardioid) types. When the microphone 63 has directivity, it may be configured to collect and detect sound from the direction of the line of sight of the user wearing the head-mounted display device 100 particularly well. In this case, in order to secure the directivity of the microphone 63, structural features may be given to the microphone 63 or to the component that accommodates it. For example, in the example of FIG. 1, it suffices that the microphone 63 and the connecting member 46 are designed such that, with the user wearing the right headphone 32 and the left headphone 34, the sound collecting portion of the microphone 63 faces the user's gaze direction. Alternatively, the microphone 63 may be embedded in the right holding unit 21 or the left holding unit 23. In this case, if a sound collecting hole is formed on the front side of the right holding unit 21 or the left holding unit 23, that is, on the surface aligned with the right optical image display unit 26 and the left optical image display unit 28, the microphone can have directivity corresponding to the user's line of sight. The user's line-of-sight direction is, for example, the direction in which the right optical image display unit 26 and the left optical image display unit 28 face, the direction toward the center of the view the user sees through the right optical image display unit 26 and the left optical image display unit 28, or the shooting direction of the camera 61. The direction of the directivity of the microphone 63 may be variable. In this case, the user's line-of-sight direction may be detected, and the directivity of the microphone 63 may be adjusted to face that direction.
[0028]
It is also possible to combine the right cord 42 and the left cord 44 into a single cord. Specifically, the lead wire inside the right cord 42 may be drawn to the left holding unit 23 side through the inside of the main body of the image display unit 20, covered with resin together with the lead wire inside the left cord 44, and combined into one cord.
[0029]
The image display unit 20 and the control device 10 transmit various signals via the connection
unit 40. Connectors (not shown) that fit each other are provided at the end of the main body cord 48 opposite to the connecting member 46 and on the control device 10. By fitting and releasing the connector of the main body cord 48 and the connector of the control device 10, the control device 10 and the image display unit 20 are connected and disconnected. For
the right cord 42, the left cord 44, and the main body cord 48, for example, a metal cable or an
optical fiber can be adopted.
[0030]
The control device 10 is a device for controlling the head mounted display 100. The control
device 10 includes switches including the determination key 11, the lighting unit 12, the display
switching key 13, the luminance switching key 15, the direction key 16, the menu key 17, and
the power switch 18. Further, the control device 10 includes a track pad 14 on which the user performs touch operations with a finger.
[0031]
The determination key 11 detects a pressing operation and outputs a signal for determining the content of the operation on the control device 10. The lighting unit 12 indicates the operating state of the head-mounted display device 100 by its light emission state. The operating state of the head-mounted display device 100 is, for example, the ON/OFF state of the power supply. For example, an LED (Light Emitting Diode) is used as the lighting unit 12. The display switching key 13 detects a pressing operation and outputs, for example, a signal for switching the display mode of content moving images between 3D and 2D.
[0032]
The track pad 14 detects the operation of the user's finger on the operation surface of the track
pad 14 and outputs a signal according to the detected content. As the track pad 14, various track
pads such as an electrostatic type, a pressure detection type, and an optical type can be adopted.
The brightness switching key 15 detects a pressing operation and outputs a signal to increase or
decrease the brightness of the image display unit 20. The direction key 16 detects the pressing
operation on the key corresponding to the up, down, left, and right directions, and outputs a
signal according to the detection content. The power switch 18 switches the power-on state of
the head-mounted display device 100 by detecting a slide operation of the switch.
[0033]
FIG. 2 is a functional block diagram of each part of the display system 1 according to the
embodiment. As shown in FIG. 2, the display system 1 includes an external device OA and a head
mounted display 100. Examples of the external device OA include a personal computer (PC), a
mobile phone terminal, a game terminal, and the like. The external device OA is used as an image
supply device that supplies an image to the head-mounted display device 100.
[0034]
The control device 10 of the head-mounted display device 100 includes a control unit 140, an operation unit 135, an input information acquisition unit 110, a storage unit 120, a power supply 130, an interface 180, a transmission unit (Tx) 51, and a transmission unit (Tx) 52. The operation unit 135 detects operations by the user. The operation unit 135 includes the determination key 11, the display switching key 13, the track pad 14, the brightness switching key 15, the direction key 16, the menu key 17, and the power switch 18 shown in FIG. 1.
[0035]
The input information acquisition unit 110 acquires a signal corresponding to an operation input
by the user. Examples of the signal corresponding to the operation input include operation input
to the track pad 14, the direction key 16, and the power switch 18. The power supply 130
supplies power to each part of the head-mounted display device 100. As the power source 130,
for example, a secondary battery can be used.
[0036]
The storage unit 120 stores various computer programs. The storage unit 120 is configured by a
ROM, a RAM, and the like. The storage unit 120 may store image data to be displayed on the
image display unit 20 of the head-mounted display device 100.
[0037]
The interface 180 is an interface for connecting various external devices OA serving as content sources to the control device 10. As the interface 180, for example, an interface compatible with
wired connection such as a USB interface, a micro USB interface, and an interface for a memory
card can be used.
[0038]
The control unit 140 realizes the functions of the respective units by reading out and executing
the computer program stored in the storage unit 120. That is, the control unit 140 functions as
an operating system (OS) 150, an image processing unit 160, an audio processing unit 170, an
object detection unit 171, a distance detection unit 173, an information output control unit 179,
and a display control unit 190. The three-axis sensor 113, the GPS 115, and the communication
unit 117 are connected to the control unit 140. The three-axis sensor 113 is a three-axis
acceleration sensor, and the control unit 140 can acquire a detection value of the three-axis
sensor 113. The GPS 115 includes an antenna (not shown), receives a GPS (Global Positioning
System) signal, and obtains the current position of the control device 10. The GPS 115 outputs
the current position and the current time obtained based on the GPS signal to the control unit
140. Also, the GPS 115 may have a function of acquiring the current time based on the
information included in the GPS signal, and correcting the time measured by the control unit 140
of the control device 10.
[0039]
The communication unit 117 executes wireless data communication conforming to a standard
such as wireless LAN (WiFi (registered trademark)), Miracast (registered trademark), Bluetooth
(registered trademark), and the like. When the external device OA is wirelessly connected to the
communication unit 117, the control unit 140 acquires content data from the communication
unit 117 and performs control for displaying an image on the image display unit 20. On the
other hand, when the external device OA is connected by wire to the interface 180, the control
unit 140 acquires content data from the interface 180 and performs control for displaying an
image on the image display unit 20. Therefore, the communication unit 117 and the interface
180 are hereinafter generically referred to as a data acquisition unit DA. The data acquisition
unit DA acquires content data to be displayed by the head-mounted display device 100 from the
external device OA. Content data can be various data such as image data and text data.
[0040]
The image processing unit 160 acquires an image signal included in the content. The image
processing unit 160 separates synchronization signals such as the vertical synchronization signal
VSync and the horizontal synchronization signal HSync from the acquired image signal. The
image processing unit 160 also generates a clock signal PCLK using a PLL (Phase Locked Loop)
circuit (not shown) or the like according to the cycle of the separated vertical synchronization
signal VSync or horizontal synchronization signal HSync. The image processing unit 160
converts an analog image signal from which the synchronization signal has been separated into a
digital image signal using an A / D conversion circuit or the like (not shown). Thereafter, the
image processing unit 160 stores the converted digital image signal in the DRAM in the storage
unit 120 frame by frame as image data (Data in the drawing) of the target image. This image data
is, for example, RGB data. Note that the image processing unit 160 may execute image processing on the image data as necessary, such as resolution conversion processing, various color tone correction processing (for example, adjustment of luminance and saturation), and keystone correction processing.
[0041]
The image processing unit 160 transmits, through the transmission units 51 and 52, the generated clock signal PCLK, the vertical synchronization signal VSync, the horizontal synchronization signal HSync, and the image data Data stored in the DRAM in the storage unit 120.
Note that the image data Data transmitted via the transmission unit 51 is also referred to as
“image data for right eye”, and the image data Data transmitted via the transmission unit 52 is
also referred to as “image data for left eye”. The transmitters 51 and 52 function as
transceivers for serial transmission between the control device 10 and the image display unit 20.
[0042]
The display control unit 190 generates control signals for controlling the right display drive unit
22 and the left display drive unit 24. Specifically, the display control unit 190 individually controls the drive ON/OFF of the right LCD 241 by the right LCD control unit 211, the drive ON/OFF of the right backlight 221 by the right backlight control unit 201, the drive ON/OFF of the left LCD 242 by the left LCD control unit 212, and the drive ON/OFF of the left backlight 222 by the left backlight control unit 202. Thereby, the display control unit 190 controls the
generation and emission of the image light by each of the right display drive unit 22 and the left
display drive unit 24. For example, the display control unit 190 causes both the right display
drive unit 22 and the left display drive unit 24 to generate image light, only one of them to
generate image light, or neither of them to generate image light.
[0043]
The display control unit 190 transmits control signals to the right LCD control unit 211 and the left LCD control unit 212 via the transmission units 51 and 52, respectively. Further, the display control unit
190 transmits control signals to the right backlight control unit 201 and the left backlight
control unit 202, respectively.
[0044]
The image display unit 20 includes a right display drive unit 22, a left display drive unit 24, a right light guide plate 261 as the right optical image display unit 26, a left light guide plate 262 as the left optical image display unit 28, a camera 61, a vibration sensor 65, and a 9-axis sensor 66.
[0045]
The vibration sensor 65 is configured using an acceleration sensor, and is disposed inside the
image display unit 20 as shown in FIG.
In the example of FIG. 1, the vibration sensor 65 is incorporated in the right holding unit 21, near the end ER of the right optical image display unit 26. When the user performs an operation (knock operation) of
tapping the end ER, the vibration sensor 65 detects the vibration due to this operation and
outputs the detection result to the control unit 140. Based on the detection result of the vibration
sensor 65, the control unit 140 detects a knocking operation by the user.
[0046]
The 9-axis sensor 66 is a motion sensor that detects acceleration (three axes), angular velocity (three axes), and geomagnetism (three axes). Since the 9-axis sensor 66 is provided in the image display unit 20, it detects the movement of the user's head when the image display unit 20 is worn on the head. Since the orientation of the image display unit 20 can be determined from the detected head movement, the control unit 140 can estimate the user's gaze direction.
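As a rough illustration of this estimation, the following sketch derives a head orientation (and hence an approximate gaze direction) from the accelerometer and magnetometer axes of a 9-axis sensor using standard tilt compensation. The function name and sensor-frame conventions are assumptions for illustration, not details taken from this patent.

    import numpy as np

    def estimate_head_orientation(accel, mag):
        """Return (roll, pitch, yaw) in radians from a gravity vector `accel`
        and a geomagnetic vector `mag` (length-3 arrays), both expressed in
        the sensor frame of the image display unit."""
        ax, ay, az = accel / np.linalg.norm(accel)
        # Roll and pitch follow from the measured gravity direction.
        roll = np.arctan2(ay, az)
        pitch = np.arctan2(-ax, np.hypot(ay, az))
        # Tilt-compensated compass heading (yaw) from the magnetometer.
        mx, my, mz = mag
        xh = (mx * np.cos(pitch) + my * np.sin(roll) * np.sin(pitch)
              + mz * np.cos(roll) * np.sin(pitch))
        yh = my * np.cos(roll) - mz * np.sin(roll)
        yaw = np.arctan2(-yh, xh)
        return roll, pitch, yaw

The yaw and pitch together indicate where the front of the image display unit 20, and therefore approximately the user's line of sight, is pointing.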
[0047]
The right display driving unit 22 includes a receiving unit (Rx) 53, a right backlight (BL) control unit 201 and a right backlight (BL) 221 functioning as a light source, a right LCD control unit 211 and a right LCD 241 functioning as a display element, and a right projection optical system 251. The right backlight control unit 201 and the right backlight 221 function as the light source. The right LCD control unit 211 and the right LCD 241 function as the display element. The right backlight control unit 201, the right LCD control unit 211, the right backlight 221, and the right LCD 241 are collectively referred to as an "image light generation unit".
[0048]
The receiving unit 53 functions as a receiver for serial transmission between the control device
10 and the image display unit 20. The right backlight control unit 201 drives the right backlight
221 based on the input control signal. The right backlight 221 is, for example, a light emitter
such as an LED or an electroluminescence (EL) element. The right LCD control unit 211 drives the right LCD 241 based on the clock signal PCLK, the vertical synchronization signal VSync, the horizontal synchronization signal HSync, and the right-eye image data Data1 input via the receiving unit 53.
The right LCD 241 is a transmissive liquid crystal panel in which a plurality of pixels are
arranged in a matrix.
[0049]
The right projection optical system 251 includes a collimator lens that converts the image light emitted from the right LCD 241 into parallel light beams. The right light guide plate 261 as
the right optical image display unit 26 guides the image light output from the right projection
optical system 251 to the right eye RE of the user while reflecting it along a predetermined
optical path. In addition, the right projection optical system 251 and the right light guide plate
261 are collectively referred to as “light guide unit”.
[0050]
The left display drive unit 24 has a configuration similar to that of the right display drive unit 22.
The left display driving unit 24 includes a receiving unit (Rx) 54, a left backlight (BL) control unit 202 and a left backlight (BL) 222 functioning as a light source, a left LCD control unit 212 and a left LCD 242 functioning as a display element, and a left projection optical system 252. The left backlight control unit 202 and the left backlight 222 function as the light
sources. The left LCD control unit 212 and the left LCD 242 function as display elements. The
left backlight control unit 202, the left LCD control unit 212, the left backlight 222, and the left
LCD 242 are collectively referred to as "image light generation unit". In addition, the left
projection optical system 252 is configured by a collimator lens that converts the image light
emitted from the left LCD 242 into parallel light beams. The left light guide plate 262 as the left
optical image display unit 28 guides the image light output from the left projection optical
system 252 to the left eye LE of the user while reflecting it along a predetermined optical path.
The left projection optical system 252 and the left light guide plate 262 are also generically
referred to as a “light guide”.
[0051]
When the user looks at the outside scene through the right optical image display unit 26 and the left optical image display unit 28, the head-mounted display device 100 displays an image based on the content data so as to overlap the outside scene. The target detection unit 171 controls the camera 61 to perform imaging and acquires the captured image. The captured image is output from the camera 61 as color image data or monochrome image data; alternatively, the camera 61 may output an image signal, and the target detection unit 171 may generate image data conforming to a predetermined file format from that signal. The target detection unit 171 analyzes the acquired captured image data and detects a target appearing in it. The target is an object or a person present in the imaging direction of the camera 61, that is, in the direction of the user's gaze. The target may also include an object (including a surface such as a wall) on which an image or video is projected, and the image or video projected on the object may itself be treated as the target. Further, the target detection unit 171 may detect a target from the image captured by the camera 61, and may further detect the state of the target.
[0052]
The distance detection unit 173 obtains the distance to the target detected by the target detection unit 171. For example, the distance detection unit 173 obtains the distance to the target based on the size of the target's image, detected by the target detection unit 171, in the captured image of the camera 61. When the target is a person or an object that emits sound, the distance detection unit 173 thereby functions to detect the distance to the sound source. The head-mounted display device 100 may also include a range finder that detects the distance to an object using laser light or ultrasonic waves. Such a range finder includes, for example, a laser light source and a light receiving unit that receives the reflected laser light, and detects the distance to the object based on the reception state of the laser light. The range finder may also be, for example, an ultrasonic range finder, that is, one that includes a sound source emitting ultrasonic waves and a detection unit detecting the ultrasonic waves reflected by the object, and that detects the distance to the object based on the reflected ultrasonic waves. Furthermore, a combination of a laser range finder and an ultrasonic range finder can be used. Such a range finder is preferably provided in the right holding unit 21 or the right display drive unit 22 of the image display unit 20; for example, it may be installed facing forward on a surface aligned with the light control plate 20A. The direction in which the range finder measures distance is desirably the direction of the user's line of sight, similar to the imaging direction of the camera 61. The distance detection unit 173 detects the distance from the camera 61 or the range finder to the target, and this distance can be regarded as the distance from the user of the head-mounted display device 100 to the target.
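As a rough illustration of the image-size-based method, the following sketch applies the pinhole camera model. The focal length, the assumed real-world height of the target, and the function name are illustrative assumptions, not values from this patent.

    def estimate_distance_from_image(real_height_m, focal_length_px, image_height_px):
        """Pinhole-camera estimate: distance = f * H / h, where f is the focal
        length in pixels, H the target's real-world height in meters, and h
        the height of the target's image in pixels."""
        return focal_length_px * real_height_m / image_height_px

    # Example: a 1.7 m tall performer imaged at 120 px with an 800 px focal
    # length is estimated to be roughly 11.3 m away.
    distance = estimate_distance_from_image(1.7, 800.0, 120.0)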
[0053]
The information output control unit 179 (sound control unit) causes the image display unit 20 to display an image based on the processing results of the target detection unit 171 and the distance detection unit 173. The image displayed by the image display unit 20 is based on the content data acquired by the data acquisition unit DA. For example, when the content data includes moving images, still images, characters, symbols, and the like, the information output control unit 179 outputs the content data, or part of the data extracted from it, to the display control unit 190, and the display control unit 190 displays the image. In addition, the information output control unit 179 may control the display attributes of the images and characters based on the distance to the target detected by the distance detection unit 173, the state of the target detected by the target detection unit 171, and the like. The display attributes are, for example, when displaying text data included in content data, the display size of characters, the display color, the font, and the presence or absence of character decoration such as bold or italics. It is also possible to arrange an image of a shape such as a rectangle, an ellipse, or a circle as the background of the text data, and the presence or absence of the background, its size and shape, and its transparency may also be included in the display attributes. When displaying image data, the display attributes are, for example, the display size, display color, and transparency of the image.
[0054]
The information output control unit 179 also controls the audio output of the right headphone 32 and the left headphone 34. When the content data acquired by the data acquisition unit DA includes audio data (sound data), the information output control unit 179 generates an audio signal based on the audio data and outputs it to the audio processing unit 170. If the content data acquired by the data acquisition unit DA does not include audio data, the information output control unit 179 acquires audio data based on the sound collected by the microphone 63 and stores it in the storage unit 120. The information output control unit 179 then generates an audio signal based on the audio data stored in the storage unit 120 and outputs it to the audio processing unit 170. Here, the information output control unit 179 may output a sound related to the target detected by the target detection unit 171, and may further output sound in accordance with the state of that target. The sound that the information output control unit 179 outputs from the right headphone 32 and the left headphone 34 may be any sound in the audible range that the user can recognize by hearing, including human voices, voices imitating human voices, animal sounds, mechanical sounds, and all other sounds. In this embodiment, an example of outputting a human voice or a voice imitating a human voice will be described. Furthermore, the information output control unit 179 may convert or adjust the audio data included in the content data in accordance with human hearing characteristics; voice data collected by the microphone 63 may likewise be converted or adjusted. The human audible range is known to be approximately 20 Hz to 20,000 Hz, but individual differences are large, and the high frequency range becomes inaudible with aging. Also, sounds in the frequency range that most people find easy to hear are perceived well even at low sound pressure. Therefore, the information output control unit 179 adjusts the audio data so that components in the 1,000 Hz to 8,000 Hz frequency band are the most abundant. The band of 1,000 Hz to 4,000 Hz is more preferable, 2,500 Hz to 4,000 Hz still more preferable, and it is most preferable that components near 3,000 Hz be the most abundant. In addition, when the age of the user of the head-mounted display device 100 is set or input, the information output control unit 179 may convert or adjust the audio data so as to change the ratio of frequency components of the sound output from the right headphone 32 and the left headphone 34 according to the user's age.
Furthermore, the information output control unit 179 may execute, on the sound data collected by the microphone 63, noise reduction processing for reducing noise components and/or sharpening processing for enhancing the clarity of the sound. In the noise reduction processing, the information output control unit 179 removes or reduces, for example, the components of the frequency band corresponding to the noise; the noise may be white noise, low-frequency sound, or human or animal voices. For example, in order to make it easier for the user of the head-mounted display device 100 to hear the voices of the people on stage during a theatrical performance, processing can be performed to remove the cheers of the audience from the audio data collected by the microphone 63.
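A minimal sketch of the band-emphasis adjustment described above, using SciPy: it boosts the 1,000-8,000 Hz components relative to the rest of the signal. The filter order, gain, and normalization strategy are illustrative assumptions.

    import numpy as np
    from scipy.signal import butter, sosfilt

    def emphasize_band(audio, sample_rate, low_hz=1000.0, high_hz=8000.0, gain=2.0):
        """Boost the components of `audio` inside [low_hz, high_hz], the band
        that human hearing perceives well even at low sound pressure."""
        sos = butter(4, [low_hz, high_hz], btype="bandpass",
                     fs=sample_rate, output="sos")
        boosted = audio + (gain - 1.0) * sosfilt(sos, audio)
        # Normalize to avoid clipping after the boost.
        return boosted / max(1.0, float(np.max(np.abs(boosted))))

A narrower passband (for example 2,500-4,000 Hz) can be substituted to match the more preferable ranges mentioned above, and a band-stop filter built the same way can serve as a simple noise reduction step.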
[0055]
The audio processing unit 170 acquires the audio signal input from the information output control unit 179, amplifies it, and supplies it to speakers (not shown) in the right headphone 32 and the left headphone 34 connected to the connecting member 46. For example, when the Dolby (registered trademark) system is adopted, the audio signal is processed, and different sounds, for example with changed frequencies, are output from the right headphone 32 and the left headphone 34, respectively.
[0056]
Further, the audio processing unit 170 acquires the sound collected by the microphone 63, converts it into digital audio data, and performs processing on the audio. The generated audio data is output to the information output control unit 179. For example, the audio processing unit 170 may perform speaker recognition, in which it extracts and models features of the acquired voices, thereby distinguishing the voices of a plurality of people and identifying the person speaking for each voice.
[0057]
The three-axis sensor 113, the GPS 115, and the communication unit 117 are connected to the
control unit 140. The three-axis sensor 113 is a three-axis acceleration sensor, and the control unit 140 can acquire the detection value of the three-axis sensor 113 and detect the movement of the control device 10 and the direction of that movement. The GPS 115 includes an antenna (not
shown), receives a GPS (Global Positioning System) signal, and obtains the current position of the
control device 10. The GPS 115 outputs the current position and the current time obtained based
on the GPS signal to the control unit 140. Also, the GPS 115 may have a function of acquiring the
current time based on the information included in the GPS signal, and correcting the time
measured by the control unit 140 of the control device 10. The communication unit 117
executes wireless data communication conforming to a wireless LAN (WiFi (registered
trademark)) or Bluetooth (registered trademark) standard.
[0058]
The interface 180 is an interface for connecting various image supply devices OA serving as content sources to the control device 10. The content supplied by the image supply device OA
includes a still image or a moving image, and may include audio. Examples of the image supply
device OA include a personal computer (PC), a mobile phone terminal, a game terminal, and the
like. As the interface 180, for example, a USB interface, a micro USB interface, an interface for a
memory card, or the like can be used. Here, it is also possible to connect the image supply device
OA to the control device 10 by a wireless communication line. In this case, the image supply
device OA performs wireless communication with the communication unit 117, and transmits
content data using a wireless communication technology such as Miracast (registered
trademark).
[0059]
FIG. 3 is a flowchart showing the operation of the head-mounted display device 100, and in
particular, shows the process of using the function of the information output control unit 179.
Note that FIG. 3 exemplifies an operation in which the head-mounted display device 100 displays
text and images based on content data and outputs sound. First, the control unit 140 acquires
content data by the data acquisition unit DA, and stores the acquired data in the storage unit 120
(step S11).
[0060]
Subsequently, the target detection unit 171 causes the camera 61 to perform imaging, acquires the captured image (step S12), and detects the image of a target in the captured image (step S13). There are, for example, two processes by which the target detection unit 171 detects a target. The first process uses data indicating features of the image of the object to be detected, stored in advance in the storage unit 120. Specifically, the target detection unit 171 acquires the feature data from the storage unit 120 and searches the captured image for a portion matching those features; the target detected in the first process is one that matches the data stored in advance in the storage unit 120. The second process is a process in which the target detection unit 171 extracts contours in the captured image, cuts out the images of people or objects appearing in it, and treats a cut-out image of at least a predetermined size as the image of the target. When images of a plurality of targets are detected in the first or second process, the target detection unit 171 may select the one target closest to the user's gaze direction, for example the image of the object closest to the center of the captured image of the camera 61.
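The following sketch illustrates both detection processes with OpenCV; the matching threshold, minimum contour area, and function names are illustrative assumptions rather than details from this patent.

    import cv2  # OpenCV

    def detect_target(captured_bgr, template_bgr, min_area=5000):
        """First process: match the captured image against pre-stored feature
        data (here a template). Second process: fall back to contour
        extraction, keeping a sufficiently large region near the center."""
        result = cv2.matchTemplate(captured_bgr, template_bgr,
                                   cv2.TM_CCOEFF_NORMED)
        _, score, _, location = cv2.minMaxLoc(result)
        if score > 0.8:  # assumed matching threshold
            h, w = template_bgr.shape[:2]
            return (location[0], location[1], w, h)
        gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        boxes = [cv2.boundingRect(c) for c in contours
                 if cv2.contourArea(c) >= min_area]
        if not boxes:
            return None
        # Prefer the candidate closest to the image center, which roughly
        # corresponds to the user's gaze direction.
        cx, cy = captured_bgr.shape[1] / 2, captured_bgr.shape[0] / 2
        return min(boxes, key=lambda b: (b[0] + b[2] / 2 - cx) ** 2
                                        + (b[1] + b[3] / 2 - cy) ** 2)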
[0061]
Next, the distance detection unit 173 executes distance detection processing for detecting the
distance to the object detected by the object detection unit 171 (step S14). Furthermore, the
information output control unit 179 executes display processing, extracts a part of the content
data acquired in step S11, and displays an image on the image display unit 20 (step S15).
[0062]
Here, the information output control unit 179 determines whether the distance to the object
detected by the distance detection unit 173 is longer than the set value (step S16). Then, if the
detected distance is longer than the set value (step S16; YES), audio output processing is
executed (step S17), audio is output from the right headphone 32 and the left headphone 34, and
the process proceeds to step S18. Details of the audio output process will be described later with
reference to FIG. The setting value is a value that is set in advance as a reference for outputting
voice and stored in the storage unit 120. If the distance detected by the distance detection unit
173 is equal to or less than the set value (step S16; NO), the process proceeds to step S18
without executing the audio output process.
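The distance-gated decision of steps S12 to S17 can be summarized in a few lines. The threshold value, object interfaces, and method names below are hypothetical stand-ins for the units described above.

    SPEECH_OUTPUT_THRESHOLD_M = 10.0  # the preset "set value" held in the
                                      # storage unit 120; the number is assumed

    def process_frame(target_detector, distance_detector, output_controller, frame):
        """One pass of the loop in FIG. 3 (steps S12 to S17), as a sketch."""
        target = target_detector.detect(frame)           # step S13
        if target is None:
            return
        distance_m = distance_detector.measure(target)   # step S14
        output_controller.display(target)                # step S15
        if distance_m > SPEECH_OUTPUT_THRESHOLD_M:       # step S16
            output_controller.output_audio(target)       # step S17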
[0063]
In step S18, the control unit 140 determines whether all the content data acquired in step S11 has been processed in steps S12 to S16. If there is unprocessed data (step S18; NO), the control unit
140 returns to step S12, and performs processing on the unprocessed data. If all data have been
processed (step S18; YES), the control unit 140 determines whether to end the display (step S19).
When the display is continued (step S19; NO), the control unit 140 returns to step S11. When the
display is ended according to the operation detected by the operation unit 135 (step S19; YES),
the control unit 140 stops the display of the image display unit 20 and ends the present process.
[0064]
09-05-2019
22
FIG. 4 is a flowchart showing the audio output processing in detail. The information output
control unit 179 determines whether the content data acquired by the data acquisition unit DA
includes audio data (step S21). If the content data includes audio data (step S21; YES), the
information output control unit 179 extracts audio data related to the target detected by the
target detection unit 171 from the content data (step S22). The information output control unit
179 generates an audio signal based on the audio data extracted in step S22 and outputs the
audio signal to the audio processing unit 170, and outputs audio from the right headphone 32
and the left headphone 34 (step S23).
[0065]
On the other hand, when the content data does not include audio data (step S21; NO), the
information output control unit 179 acquires audio data generated based on the audio collected
by the microphone 63 (step S24). The information output control unit 179 generates an audio
signal based on the audio data acquired in step S24 and outputs the audio signal to the audio
processing unit 170, and outputs audio from the right headphone 32 and the left headphone 34
(step S23). If it is determined in step S22 that the audio data included in the content data does
not include audio data related to the object, the process may move to step S24. In addition,
during the operation of the head-mounted display device 100, the sound processing unit 170
may generate sound data based on the sound collected by the microphone 63 and store the
sound data in the storage unit 120. In this case, the information output control unit 179 may
obtain the audio data generated by the audio processing unit 170 from the storage unit 120 in
step S24.
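The source-selection logic of FIG. 4 amounts to preferring content audio related to the target and falling back to the microphone. The object interfaces below are hypothetical sketches of the units described above.

    def audio_output_process(content, target, microphone, headphones):
        """Sketch of FIG. 4: play audio bundled with the content when it
        relates to the detected target, otherwise play the sound collected
        by the microphone 63."""
        if content.has_audio():                        # step S21
            clip = content.extract_audio_for(target)   # step S22
            if clip is not None:
                headphones.play(clip)                  # step S23
                return
        clip = microphone.acquire_audio()              # step S24
        headphones.play(clip)                          # step S23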
[0066]
FIG. 5 is an explanatory view showing a typical application example of the head-mounted display device 100. FIG. 5A is a schematic view showing the configuration of a theater TH in which the head-mounted display device 100 is used, and FIGS. 5B and 5C show examples of the visual field VR of a user using the head-mounted display device 100 in the theater TH. The visual field VR is the field of view the user sees through the right optical image display unit 26 and the left optical image display unit 28 of the image display unit 20. Since the image display unit 20 can be seen through to the outside scene, the stage ST is visible in the visual field VR. The visual field VR includes the curtains CT disposed at the top and at the left and right ends of the stage ST, and the stage wings SS at the left and right sides of the stage ST. The performer A is visible on the stage ST. In this example, the performer A is on the stage ST, and the user is looking at the performer A.
[0067]
The theater TH shown in FIG. 5A has a configuration in which a large number of seats SH for spectators, including the user of the head-mounted display device 100, are arranged facing the stage ST. The user of the head-mounted display device 100 sits on a seat SH and uses the head-mounted display device 100 while viewing the stage ST.
[0068]
For example, FIG. 5B shows the visual field VR of the user seated on the seat SH1 close to the stage ST. The visual field VR covers a relatively narrow range centered on the performer A at the center of the stage ST. In this example, the content data acquired by the data acquisition unit DA includes text data for the program staged in the theater TH. More specifically, text data of the whole program or a part of it, including the lines spoken by the performer A and explanations of the program, is acquired in step S11 (FIG. 3). For example, when the text data includes a plurality of lines and/or a plurality of explanations, the information output control unit 179 may divide the content data acquired in step S11 and display it in portions in accordance with the progress of the program. In this case, the content data may include the text data together with data indicating the timing for displaying it. In the example of FIG. 5B, the text 311 is displayed based on the text data of the performer A's lines. When the user is seated at the seat SH1, the user can directly hear the voice of the performer A. Therefore, the head-mounted display device 100 does not output sound, so as not to disturb the performance.
[0069]
FIG. 5C shows, for example, the visual field VR of the user seated at the seat SH2, located away from the stage ST. The visual field VR covers a wide range centered on the performer A at the center of the stage ST, including the curtain CT and the stage wings SS on both sides. In the example of FIG. 5C, the text 311 is displayed based on the text data of the performer A's lines.
[0070]
When the user is seated at the seat SH2, the distance from the head-mounted display device 100 to the performer A is longer than the set value (step S16 in FIG. 3). Therefore, the information output control unit 179 outputs sound based on the sound data included in the content data acquired by the data acquisition unit DA, or based on the sound data of the sound collected by the microphone 63.
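As a minimal sketch, the distance-based decision of step S16 could be expressed as follows; the threshold value and the function name are hypothetical illustrations, since the patent does not fix a concrete set value.

def should_output_audio(detected_distance_m, set_value_m=10.0):
    """Output audio only when the object is farther than the preset distance
    (step S16): a nearby performer can be heard directly, so the device
    stays silent to avoid disturbing the theater."""
    return detected_distance_m > set_value_m

print(should_output_audio(3.0))   # seat SH1, close to the stage -> False
print(should_output_audio(25.0))  # seat SH2, far from the stage -> True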
[0071]
In this example, the target detection unit 171 detects the performer A as the object in the direction of the user's line of sight. When the content data includes sound data, the information output control unit 179 extracts the sound data related to the performer A and outputs the sound. When the microphone 63 has directivity as described above, it collects the sound arriving from the stage ST. For this reason, the sound collected by the microphone 63 relates to the stage ST, and when the performer A is speaking, this voice is collected. The information output control unit 179 therefore outputs sound based on the sound data of the sound collected by the microphone 63. A user at the seat SH2, away from the stage ST, cannot hear, or can hardly hear, the voice of the performer A and the like. In such a situation, the head-mounted display device 100 can output the voice of the performer A to the user. Since the head-mounted display device 100 outputs sound from the right headphone 32 and the left headphone 34, the output sound cannot be heard by the other spectators in the theater TH, and the surrounding people are not made uncomfortable. In this case, the information output control unit 179 may execute a noise reduction process that reduces sounds other than the voice of the performer A, for example the voices of the audience, in the data of the sound collected by the microphone 63.
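The patent does not specify a noise-reduction algorithm; one common technique that fits this description is spectral subtraction, sketched below over NumPy arrays of mono samples purely for illustration.

import numpy as np

def reduce_noise(signal, noise_profile):
    """Suppress the estimated audience-noise spectrum, keeping the rest
    (ideally the performer's voice)."""
    spectrum = np.fft.rfft(signal)
    noise_mag = np.abs(np.fft.rfft(noise_profile, n=len(signal)))
    # Simple spectral subtraction: clamp magnitudes at zero.
    magnitude = np.maximum(np.abs(spectrum) - noise_mag, 0.0)
    return np.fft.irfft(magnitude * np.exp(1j * np.angle(spectrum)), n=len(signal))

rng = np.random.default_rng(0)
voice = np.sin(2 * np.pi * 220 * np.arange(2048) / 16000)  # stand-in for speech
noise = 0.3 * rng.standard_normal(2048)
cleaned = reduce_noise(voice + noise, noise)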
[0072]
Furthermore, the information output control unit 179 can output sound in accordance with the state of the object detected by the target detection unit 171. In the example of FIG. 5C, the target detection unit 171 may detect the performer A, who is the object, from the captured image, and may detect, as the state of the object, whether the mouth of the performer A is open. In this case, the information output control unit 179 acquires, from the target detection unit 171, data indicating whether or not the mouth of the performer A is open. The information output control unit 179 outputs sound from the right headphone 32 and the left headphone 34, for example, while the mouth of the performer A is open. In this case, the user of the head-mounted display device 100 can listen to a natural sound, close to the state of actually hearing the sound emitted by the performer A. In addition, the information output control unit 179 may analyze the shape and movement of the mouth of the performer A and adjust the timing of outputting the voice. In this case, the information output control unit 179 may adjust the timing of outputting the sound from the right headphone 32 and the left headphone 34 in accordance with the distance detected by the distance detection unit 173. For example, when the user is seated in a seat SH away from the stage ST, it takes time for the voice of the performer A on the stage ST to reach the user. Therefore, the information output control unit 179 may determine the output timing of the sound based on the captured image data, and may further correct the output timing based on the distance detected by the distance detection unit 173.
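The timing correction suggested above can be illustrated with the acoustic propagation delay; the speed of sound is an approximation and the function name is a hypothetical placeholder, not part of the patent.

SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees C

def playback_delay_seconds(distance_m):
    """Time for the stage sound to travel the detected distance."""
    return distance_m / SPEED_OF_SOUND_M_PER_S

# A seat 30 m from the stage hears the performer about 87 ms late, so the
# device could hold the headphone audio back by the same amount.
print(f"{playback_delay_seconds(30.0) * 1000:.0f} ms")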
[0073]
In addition, the information output control unit 179 may extract the voice of a specific person (for example, the performer A) from the sound collected by the microphone 63, emphasize the extracted voice, and output it from the right headphone 32 and the left headphone 34. When the microphone 63 has directivity, the collected sound may be analyzed to separate the voice of each person and extract the voice of the person with the largest volume, or a specific voice may be extracted based on the image captured by the camera 61. Further, the information output control unit 179 may analyze the content of the sound collected by the microphone 63 and adjust or convert the data of the collected sound based on that content. For example, when the sound collected by the microphone 63 is a human voice (for example, the singing voice of an opera singer), the information output control unit 179 performs processing such as emphasizing the volume of a specific frequency band so that the voice is easy to hear. When the content of the collected sound is music with a wide dynamic range, the sound data may be processed to emphasize high and low tones. Furthermore, when the right headphone 32 and the left headphone 34 have a bone conduction speaker or a subwoofer, the volume of the frequency band suited to the bone conduction speaker or the subwoofer may be amplified.
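A minimal sketch of the frequency-band emphasis mentioned above follows; the band limits (roughly the speech band) and the gain are illustrative values chosen here, not values from the patent.

import numpy as np

def emphasize_band(samples, rate, lo_hz=300.0, hi_hz=3400.0, gain=2.0):
    """Amplify the band where speech energy concentrates (about 300-3400 Hz)."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    spectrum[(freqs >= lo_hz) & (freqs <= hi_hz)] *= gain  # boost the voice band
    return np.fft.irfft(spectrum, n=len(samples))

tone = np.sin(2 * np.pi * 1000 * np.arange(16000) / 16000)  # 1 kHz test tone
louder = emphasize_band(tone, rate=16000)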
[0074]
In addition, when sound is output by a speaker or the like in the environment where the head-mounted display device 100 is used (for example, the theater TH), and the sound the user is listening to, such as the performance sound of an instrument, can be collected in that environment, the information output control unit 179 may output sound in coordination with the sounds in the environment. For example, the information output control unit 179 may convert and adjust the sound data, and/or adjust the output timing of the sound of the right headphone 32 and the left headphone 34, reflecting the reverberation time and the like corresponding to the distance to the sound source detected by the distance detection unit 173. In this case, the user can hear high-quality sound with a sense of presence even at a position away from the sound source. Furthermore, the information output control unit 179 may emphasize the sound data acquired by the data acquisition unit DA relative to the environmental sound collected by the microphone 63 and output it. Alternatively, the information output control unit 179 may analyze the balance comfortable for the individual user, and adjust or convert the sound data of the sound collected by the microphone 63 and the sound data acquired by the data acquisition unit DA so that they suit the user. In addition, when synthesizing the sound data of the sound collected by the microphone 63 and the sound data acquired by the data acquisition unit DA, the ratio of the volume of each frequency component and the like may be adjusted so that the resulting sound suits the user. The mixing ratio at the time of synthesis may be determined by automatic analysis, may be set in advance, or may be set by the user operating the control device 10.
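The synthesis of collected sound and acquired content audio could, as a sketch, be a weighted blend; the ratio parameter and the clipping step are assumptions for illustration.

import numpy as np

def mix(mic, content, content_ratio=0.7):
    """Blend two equal-length mono signals; content_ratio could come from a
    preset, automatic analysis, or an operation on the control device 10."""
    blended = (1.0 - content_ratio) * mic + content_ratio * content
    return np.clip(blended, -1.0, 1.0)  # keep the result in valid sample range

ambience = 0.2 * np.ones(4)
narration = 0.8 * np.ones(4)
print(mix(ambience, narration))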
[0075]
As shown in FIG. 1, the head-mounted display device 100 has a configuration in which the right headphone 32 and the left headphone 34 are separate from the right holding unit 21 and the left holding unit 23. Therefore, the user can wear only one of the right headphone 32 and the left headphone 34 and listen to the environmental sound with the other ear. In this case, the information output control unit 179 may output monaural sound to the headphone worn by the user, and may switch to stereo output when the user wears both the right headphone 32 and the left headphone 34.
[0076]
As described above, the head-mounted display device 100 according to the embodiment to which the present invention is applied includes the image display unit 20, which is worn on the body of the user and used, transmits the outside scene, and displays an image so as to be visible together with the outside scene. The head-mounted display device 100 also includes the right headphone 32 and the left headphone 34 that output sound. Furthermore, it includes the target detection unit 171 that detects an object in the direction of the line of sight of the user, the distance detection unit 173 that detects the distance between the detected object and the user, and the information output control unit 179 that controls the output of the sound of the right headphone 32 and the left headphone 34 in accordance with the detected distance. As a result, it is possible to output sound corresponding to external factors of the head-mounted display device 100.
[0077]
In addition, the head-mounted display device 100 includes the data acquisition unit DA that acquires sound data related to the object. When the distance detected by the distance detection unit 173 is longer than the preset distance, the information output control unit 179 outputs the sound acquired by the data acquisition unit DA from the right headphone 32 and the left headphone 34. For this reason, the audio output can be controlled in accordance with the distance between the head-mounted display device 100 and the object. In addition, the head-mounted display device 100 includes the camera 61 that captures an image in the direction of the user's line of sight, and the target detection unit 171 detects, based on the captured image, the object that the user visually recognizes through the image display unit 20. Therefore, an object in the direction of the line of sight of the user can be detected based on the captured image.
[0078]
Further, the target detection unit 171 detects, based on the captured image of the camera 61, the state of the object that the user visually recognizes through the image display unit 20, and the information output control unit 179 outputs the sound acquired by the data acquisition unit DA from the right headphone 32 and the left headphone 34 in accordance with the state of the object detected by the target detection unit 171. Therefore, it is possible to output sound corresponding to the state of the object detected from the captured image. Further, by using the right headphone 32 and the left headphone 34 worn by the user as the sound output unit, there is an advantage that the sound output to the user is unlikely to leak to the surroundings. In addition, the head-mounted display device 100 includes the microphone 63 that detects sound. The information output control unit 179 detects, with the microphone 63, the sound emitted by the object detected by the target detection unit 171, and outputs the detected sound, or a sound having the same content as the detected sound, from the right headphone 32 and the left headphone 34. Thus, the head-mounted display device 100 can output the detected sound itself or a sound with the same content.
[0079]
The present invention is not limited to the configuration of the above embodiment, and can be implemented in various modes without departing from the scope of the invention. For example, in the above embodiment, the information output control unit 179 determines whether to output sound based on the distance detected by the distance detection unit 173, but the present invention is not limited to this; whether or not to output sound may be determined according to conditions set in advance. In the example of FIG. 5A, even when the user of the head-mounted display device 100 is seated at a position close to the stage ST, sound may be output in response to an operation by the user. In addition, the information output control unit 179 may convert text data included in the content data into sound data by text-to-speech processing and output voice based on that sound data. Furthermore, when the text or image displayed on the image display unit 20 is related to the sound being output, the information output control unit 179 may change the display aspect of that text or image in accordance with the output of the sound. Specifically, when the text 311 shown in FIG. 5C is a line of the performer A and the voice of that line is being output, the display attributes of the text 311 may be changed during the audio output. In this case, an effect can be given to the display of the text or image according to the output state of the audio, and information on the audio being output can be provided.
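Tying a display attribute of the text 311 to the audio output state could, as a sketch, look like the following; the attribute names are hypothetical placeholders, since the patent leaves the concrete attributes open.

def text_style(audio_playing):
    """Highlight the line being spoken while its audio is output."""
    if audio_playing:
        return {"color": "yellow", "weight": "bold"}  # emphasized during playback
    return {"color": "white", "weight": "normal"}     # default appearance

print(text_style(audio_playing=True))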
[0080]
Further, in the above embodiment, the configuration in which the head-mounted display device 100 includes the control device 10 and the image display unit 20 connected by the main body cord 48 is exemplified. The present invention is not limited to this, and the functional units of the control device 10 may be housed integrally in the image display unit 20; for example, the image display unit 20 may house the secondary battery of the power source 130 (FIG. 2). In this case, the head-mounted display device 100 appears to consist only of the image display unit 20 worn on the head of the user, but its functions suffice for the application of the present invention. A configuration is also possible in which an information processing apparatus having a processing function, such as a personal computer, a smartphone, a tablet computer, or a wristwatch computer, transmits data for display to the head-mounted display device 100, and the image display unit 20 receives and displays that data. In this case, the head-mounted display device 100 functions as a display of the above-described information processing apparatus. Alternatively, a mode is also possible in which the head-mounted display device 100 is used as an information processing apparatus having a processing function; in this case, the head-mounted display device 100 can realize various functions by the CPU of the control unit 140 executing a program. Alternatively, the functions of the control device 10 and the image display unit 20 may be arranged differently from the example of FIG. 2; for example, the decision key 11, the display switching key 13, the track pad 14, and the operation unit 135 may be provided in each of the control device 10 and the image display unit 20.
[0081]
Also, as the image display unit, instead of the image display unit 20, an image display unit of another type, such as one worn like a hat, may be adopted; it suffices that it has a display unit that displays an image corresponding to the left eye of the user and a display unit that displays an image corresponding to the right eye of the user. Further, the display device of the present invention may be configured, for example, as a head mounted display mounted on a vehicle such as an automobile or an airplane. It may also be configured, for example, as a head mounted display built into a body protector such as a helmet, or as a head-up display (HUD) used for the windshield of an automobile.
[0082]
Furthermore, in the above embodiment, the configuration in which the image display unit 20 and the control device 10 are separate and connected via the connection unit 40 is described as an example; the control device 10 and the image display unit 20 may instead be configured integrally and worn on the head of the user. Further, the control device 10 and the image display unit 20 may be connected by a longer cable or by a wireless communication line, and as the control device 10, a notebook computer, a tablet computer, a desktop computer, a game machine, a portable electronic device including a portable telephone, a smartphone, or a portable media player, or another dedicated device may be used.
[0083]
Furthermore, for example, as a configuration for generating image light in the image display unit 20, a display using organic EL (organic electro-luminescence) and an organic EL control unit may be provided, and LCOS (liquid crystal on silicon; LCoS is a registered trademark), digital micromirror devices, and the like can also be used. Further, for example, the present invention can be applied to a head mounted display of the laser retinal projection type. That is, a configuration may be adopted in which the image generation unit includes a laser light source and an optical system that guides the laser light to the eye of the user, makes the laser light incident on the eye of the user to scan the retina and form an image on the retina, and thereby makes the user visually recognize the image. When a head mounted display of the laser retinal projection type is adopted, the “image light emission possible area in the image light generation unit” can be defined as the image area recognized by the eye of the user.
[0084]
As an optical system that guides the image light to the eye of the user, a configuration can be adopted that employs an optical member which transmits external light incident on the device from the outside, and makes the external light incident on the eye of the user together with the image light. An optical member located in front of the eye of the user and overlapping a part or all of the user's field of view may also be used. Furthermore, an optical system of a scanning type, in which laser light or the like is scanned to form image light, may be adopted. Further, the optical system is not limited to one that guides the image light inside an optical member, and may have only the function of guiding the image light toward the eye of the user by refraction and/or reflection.
[0085]
The present invention can also be applied to a display device that adopts MEMS display technology by employing a scanning optical system using a MEMS mirror. That is, as image display elements, the device may be provided with a signal light forming unit, a scanning optical system having a MEMS mirror that scans the light emitted by the signal light forming unit, and an optical member on which a virtual image is formed by the light scanned by the scanning optical system. In this configuration, the light emitted from the signal light forming unit is reflected by the MEMS mirror, enters the optical member, is guided through the optical member, and reaches the virtual image forming surface. When the MEMS mirror scans the light, a virtual image is formed on the virtual image forming surface, and the user perceives the image by visually recognizing the virtual image. The optical component in this case may be one that guides light through multiple reflections, such as the right light guide plate 261 and the left light guide plate 262 in the above embodiment, or a half mirror surface may be used.
[0086]
Further, the display device of the present invention is not limited to a head-mounted display device, and can be applied to various display devices such as a flat panel display and a projector. The display device of the present invention may be any device that allows an image to be visually recognized by image light together with outside light; for example, a configuration in which the image by the image light is visually recognized through an optical member that transmits the outside light is conceivable. Specifically, in addition to the configuration of the above head mounted display provided with an optical member that transmits external light, the present invention can also be applied to a display device that projects image light onto a translucent flat or curved surface (glass, transparent plastic, or the like) installed, fixedly or movably, at a position away from the user. An example is the configuration of a display device that projects image light onto the window glass of a vehicle so that a user on board, or a user outside the vehicle, visually recognizes the scenery inside and outside the vehicle together with the image by the image light. Another example is the configuration of a display device that projects image light onto a fixed transparent, translucent, or colored-transparent display surface, such as the window glass of a building, so that users around the display surface visually recognize the image by the image light together with the scenery seen through the display surface.
[0087]
Further, at least a part of each functional block shown in FIG. 2 may be realized by hardware, or may be realized by a combination of hardware and software, and the configuration is not limited to one in which independent hardware resources are allocated as shown in FIG. 2. Further, the program executed by the control unit 140 may be stored in the storage unit 120 or in a storage device in the control device 10, or a program stored in an external device may be acquired via the communication unit 117 or the interface 180 and executed. Further, among the configurations formed in the control device 10, only the operation unit 135 may be formed as a single user interface (UI), or the power supply 130 in the above embodiment may be formed separately and be replaceable. Further, a configuration formed in the control device 10 may also be formed, in duplicate, in the image display unit 20. For example, the control unit 140 shown in FIG. 2 may be formed in both the control device 10 and the image display unit 20, and the functions performed by the control unit 140 formed in the control device 10 and by the CPU formed in the image display unit 20 may be divided between them.
[0088]
Reference Signs List 1 ... display system, 10 ... control device, 20 ... image display unit (display unit), 21 ... right holding unit, 22 ... right display driving unit, 23 ... left holding unit, 24 ... left display driving unit, 26 ... right optical image display unit, 28 ... left optical image display unit, 32 ... right headphone (sound output unit), 34 ... left headphone (sound output unit), 61 ... camera (imaging unit), 63 ... microphone (sound detection unit), 100 ... head-mounted display device (display device), 117 ... communication unit, 120 ... storage unit, 140 ... control unit, 150 ... operating system, 160 ... image processing unit, 170 ... sound processing unit, 171 ... target detection unit, 173 ... distance detection unit, 179 ... information output control unit (sound control unit), 180 ... interface, 190 ... display control unit, 201 ... right backlight control unit, 202 ... left backlight control unit, 211 ... right LCD control unit, 212 ... left LCD control unit, 221 ... right backlight, 222 ... left backlight, 241 ... right LCD, 242 ... left LCD, 251 ... right projection optical system, 252 ... left projection optical system, 261 ... right light guide plate, 262 ... left light guide plate, 311 ... text, DA ... data acquisition unit.