Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2014081727
Abstract: To provide an audio output system capable of offering the user new forms of play using physically different audio output devices. An information processing apparatus generates a signal of a first audio to be output to a first output device and a signal of a second audio to be output to a second output device, the second audio having content different from the first audio. The second output device allows its volume to be set and outputs sound based on the second audio signal at the set volume. The second output device also transmits information on the set volume to the information processing apparatus. When the set volume is low, the information processing apparatus generates the first audio signal such that at least a part of the audio included in the second audio is included in the first audio. [Selected figure] Figure 9
Audio output system, audio output program, audio output control method, and information processing apparatus
[0001]
The present invention relates to an audio output system, an audio output program, an audio output control method, and an information processing apparatus, and more particularly to an audio output system, an audio output program, an audio output control method, and an information processing apparatus that use a plurality of different audio output devices.
[0002]
Conventionally, a game system is known that uses a common television device (first video output device) together with a controller (second video output device) having a display unit capable of outputting video. For such a game system it has been proposed, for example, to display a first game video on the television device and a second game video, different from the first game video, on the display unit of the controller.
[0003]
JP 2012-135337 A
[0004]
However, the above-mentioned proposal mainly concerns which images to display and how to display them in cooperation with the game processing. It makes no particular mention or suggestion regarding the processing of audio.
[0005]
Therefore, an object of the present invention is to provide an audio output system capable of offering the user new forms of play using physically different audio output devices.
[0006]
The above object is achieved, for example, by the following configuration.
[0007]
One configuration example is an audio output system including an information processing apparatus, a first output device including a first audio output unit, and a second output device including a second audio output unit. The information processing apparatus includes an audio generation unit that, based on predetermined information processing, generates a first audio signal to be output to the first output device and a second audio signal to be output to the second output device, the second audio signal having audio content different from that of the first audio signal. The second output device includes a volume setting unit for setting the volume, the second audio output unit, which outputs sound based on the second audio signal at the volume set by the volume setting unit, and a volume notification unit for transmitting information on the volume set by the volume setting unit to the information processing apparatus. When the volume set by the volume setting unit is low, the audio generation unit generates the first audio signal such that at least a part of the audio included in the second audio is included in the first audio. Here, the case where the volume set by the volume setting unit is low means that the volume is so low that the user cannot recognize the sound emitted from the second audio output unit.
[0008]
According to the above configuration example, when there is audio that the user should recognize, that audio can be heard by the user more reliably.
[0009]
As another configuration example, the audio generation unit may generate the first audio signal such that at least a part of the audio included in the second audio is included in the first audio when the volume set by the volume setting unit is equal to or less than a predetermined value. Alternatively, the audio generation unit may generate the first audio signal such that at least a part of the audio included in the second audio is included in the first audio when the volume set by the volume setting unit is 0.
[0010]
As still another configuration example, the audio generation unit may generate the first audio signal such that the reproduction volume of the second audio included in the first audio signal increases as the volume set by the volume setting unit decreases, and decreases as the volume set by the volume setting unit increases.
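As a rough illustration of the two volume conditions above (volume equal to or less than a predetermined value, or volume exactly 0), the following Python sketch decides whether the second audio must also be placed in the first audio signal. It is not taken from the patent; the 0 to 255 slider range comes from the embodiment described later, and the threshold value is an assumption.

```python
THRESHOLD = 32  # assumed cutoff for "equal to or less than the predetermined value"

def duplicate_on_first_output(set_volume: int, threshold: int = THRESHOLD) -> bool:
    """True when at least part of the second audio should also be included in the first audio.

    Passing threshold=0 gives the stricter variant in which duplication happens
    only when the set volume is exactly 0.
    """
    return set_volume <= threshold

print(duplicate_on_first_output(0))     # True: volume is 0
print(duplicate_on_first_output(16))    # True: at or below the assumed threshold
print(duplicate_on_first_output(200))   # False: audible on the second output device
print(duplicate_on_first_output(5, 0))  # False under the strict "volume is 0" variant
```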
[0011]
According to the above configuration example, even when the volume of the second output device is set so low that the user cannot hear it while predetermined information processing that outputs certain audio only from the second output device is being executed, the user can still recognize that audio through the first output device.
[0012]
In still another configuration example, the first output device may further include a first display
unit, and the second output device may further include a second display unit.
The information processing apparatus may further include an image generation unit that
generates a first image to be displayed on the first display unit and a second image to be
displayed on the second display unit.
[0013]
As still another configuration example, the volume notification unit may include an operation data generation unit that generates operation data based on the user's operation, and the information processing apparatus may further include a game processing unit that performs game processing based on the operation data. The operation data generation unit may then generate operation data that includes the information on the volume set by the volume setting unit.
[0014]
As still another configuration example, the second voice may be voice that needs to be
recognized by the user reliably during execution of predetermined information processing.
[0015]
According to the above configuration example, in game processing, for example, it is possible to provide a game that uses rendering effects relying on sound output only from the second output device. In such game processing, even when the user turns down the volume of the second output device, the loss of the sound-based effects intended by the creator can be reduced.
[0016]
According to the present invention, when audio that the user should hear reliably is used in the predetermined information processing, the user can be made to recognize that audio more reliably.
[0017]
External view showing an example of the game system 1 according to an embodiment of the present invention.
Functional block diagram showing an example of the game apparatus main body 5 of FIG. 1.
External view showing an example of the terminal device 6 of FIG. 1.
Block diagram showing an example of the configuration of the terminal device 6.
Diagram showing an example of an output state of game sound.
Diagram showing another example of an output state of game sound.
Diagram showing an example of the memory map of the memory 12.
Diagram showing an example of the configuration of the terminal operation data 83.
Flowchart showing the flow of the game processing.
Flowchart showing the flow of the control process of the terminal device 6.
[0018]
A game system according to an embodiment of the present invention will be described with reference to FIG. 1.
[0019]
As shown in FIG. 1, a game system 1 includes a home television receiver (hereinafter referred to as a monitor) 2, which is an example of display means, and a stationary game apparatus 3 connected to the monitor 2 via a connection cord. The monitor 2 includes a speaker 2a.
Further, the game apparatus 3 includes a game apparatus body 5 and a terminal device 6.
[0020]
The monitor 2 displays a game image output from the game apparatus body 5.
The monitor 2 has a speaker 2a, and the speaker 2a outputs game sound output from the game apparatus body 5. In this example the monitor 2 has the built-in speaker 2a, but in another embodiment an external speaker may be separately connected to the monitor 2 (via an amplifier or the like).
[0021]
The game apparatus body 5 executes game processing and the like based on a game program
and the like stored in an optical disc readable by the game apparatus body 5.
[0022]
The terminal device 6 is an input device that can be held by the user.
The user can hold and move the terminal device 6 or use the terminal device 6 at an arbitrary
position. The terminal device 6 includes an LCD (Liquid Crystal Display: liquid crystal display
device) 21 as a display unit, a speaker 23, a headphone terminal described later, an input unit
(analog stick, pressed button, touch panel, etc.) and the like. The terminal device 6 and the game
apparatus body 5 can communicate wirelessly (or may be wired). The terminal device 6 receives
data of an image (for example, a game image) generated by the game apparatus body 5 from the
game apparatus body 5 and displays an image indicated by the data on the LCD 21. In addition,
the terminal device 6 receives from the game apparatus body 5 data of sound (for example,
sound effects and BGM of a game) generated by the game apparatus body 5 and outputs a sound
indicated by the data from the speaker 23. Further, the terminal device 6 transmits, to the game
apparatus body 5, operation data representing the contents of the operation performed on the
own device.
[0023]
FIG. 2 is a block diagram of the game apparatus body 5. In FIG. 2, the game apparatus body 5 is
an example of an information processing apparatus. In the present embodiment, the game
apparatus body 5 has a CPU (control unit) 11, a memory 12, a system LSI 13, a wireless
communication unit 14, an AV-IC (Audio Video-Integrated Circuit) 15, and the like.
[0024]
The CPU 11 executes a predetermined information processing program using the memory 12
and the system LSI 13 or the like. Thereby, various functions (for example, game processing) in
the game apparatus 3 are realized.
[0025]
The system LSI 13 includes a graphics processor unit (GPU) 16, a digital signal processor (DSP)
17, an input / output processor 18, and the like.
[0026]
The GPU 16 generates an image in accordance with a graphics command (drawing instruction)
from the CPU 11.
In the present embodiment, the game apparatus body 5 may generate both the game image
displayed on the monitor 2 and the game image displayed on the terminal device 6. Hereinafter,
the game image displayed on the monitor 2 may be referred to as “monitor game image”, and
the game image displayed on the terminal device 6 may be referred to as “terminal game
image”.
[0027]
The DSP 17 functions as an audio processor and generates audio data using sound data and sound waveform (tone color) data stored in the memory 12. In the present embodiment, as with the game images, both the game sound output from the speaker 2a of the monitor 2 and the game sound output from the speaker 23 of the terminal device 6 may be generated. Hereinafter, the game sound output from the monitor 2 may be referred to as “monitor game sound”, and the game sound output from the terminal device 6 may be referred to as “terminal game sound”.
[0028]
The input/output processor 18 transmits and receives data to and from the terminal device 6 via the wireless communication unit 14. In the present embodiment, the input/output processor 18 transmits the data of the terminal game image generated by the GPU 16 and of the terminal game sound generated by the DSP 17 to the terminal device 6 via the wireless communication unit 14. At this time, the terminal game image may be compressed before transmission so that display of the image is not delayed. Further, the input/output processor 18 receives the operation data and the like transmitted from the terminal device 6 via the wireless communication unit 14 and stores it (temporarily) in a buffer area of the memory 12.
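The transfer just described can be pictured with the small sketch below. It is only an illustration of the data flow: zlib stands in for whatever image compression the actual hardware uses, and the send/receive helpers are hypothetical, not a real console API.

```python
import zlib
from collections import deque

operation_data_buffer = deque(maxlen=16)   # stands in for the buffer area of the memory 12

def send_to_terminal(send, terminal_image: bytes, terminal_sound: bytes) -> None:
    # Compress the terminal game image so that display of the image is not delayed.
    send(("image", zlib.compress(terminal_image)))
    send(("sound", terminal_sound))

def receive_from_terminal(operation_data: dict) -> None:
    # Store (temporarily) the operation data sent back by the terminal device 6.
    operation_data_buffer.append(operation_data)

outgoing = []
send_to_terminal(outgoing.append, b"terminal game image", b"terminal game sound")
receive_from_terminal({"set_volume_data": 0})
print(len(outgoing), list(operation_data_buffer))
```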
[0029]
Further, among the images and sounds generated in the game apparatus body 5, the image data
and the sound data to be output to the monitor 2 are read by the AV-IC 15. The AV-IC 15 outputs
the read image data to the monitor 2 via an AV connector (not shown), and outputs the read
audio data to the speaker 2a built in the monitor 2. As a result, an image is displayed on the
monitor 2 and sound is output from the speaker 2a.
[0030]
FIG. 3 is a diagram showing an example of an appearance configuration of the terminal device 6.
As shown in FIG. 3, the terminal device 6 includes a substantially plate-like housing 20. The
housing 20 has a size (shape) that can be gripped by the user with both hands or one hand. The
terminal device 6 also includes an LCD 21 which is an example of a display unit. The terminal
game image is displayed on the LCD 21.
[0031]
The terminal device 6 also includes a speaker 23. The terminal game sound is output from the
speaker 23. The terminal device 6 further includes a volume adjustment slider 28 for adjusting
the volume of the speaker 23. The volume adjustment slider 28 is a slider-type switch.
[0032]
The terminal device 6 also includes a touch panel 22. The touch panel 22 is an example of a
position detection unit that detects a position input to a predetermined input surface (screen of
the display unit) provided in the housing 20. Furthermore, the terminal device 6 includes an
analog stick 25, a cross key 26, a button 27 and the like as an operation unit (the operation unit
31 shown in FIG. 4).
[0033]
FIG. 4 is a block diagram showing the electrical configuration of the terminal device 6. As shown
in FIG. 4, the terminal device 6 includes the LCD 21 described above, the touch panel 22, the
speaker 23, the volume adjustment slider 28, and the operation unit 31. The terminal device 6 also includes a motion sensor 32 (for example, an acceleration sensor or a gyro sensor) for detecting the attitude of the terminal device 6.
[0034]
The terminal device 6 further includes a wireless communication unit 34 capable of wireless
communication with the game apparatus body 5. In the present embodiment, wireless
communication is performed between the terminal device 6 and the game apparatus body 5, but
in another embodiment, communication may be performed by wire.
[0035]
The terminal device 6 also includes a control unit 33 that controls the operation of the terminal
device 6. Specifically, the control unit 33 receives output data from each input unit (the touch
panel 22, the operation unit 31, the motion sensor 32). Then, the received data is transmitted to
the game apparatus body 5 via the wireless communication unit 34 as operation data. Further,
when the terminal game image is received from the game apparatus body 5 by the wireless communication unit 34, the control unit 33 performs appropriate processing as needed (for example, decompression processing when the image data is compressed) and displays the image from the game apparatus body 5 on the LCD 21. Furthermore, when the terminal game sound from
the game apparatus body 5 is received by the wireless communication unit 34, the control unit
33 outputs the terminal game sound to the speaker 23.
[0036]
Next, with reference to FIG. 5 and FIG. 6, the outline of the game processing executed in the
system of the present embodiment will be described.
[0037]
In the present embodiment, as shown in FIG. 5, it is assumed that game processing is performed
in which a predetermined game sound is basically output only as a terminal game sound.
This corresponds to a case where, for the sake of the game's presentation, the game sounds are divided between the speaker 2a of the monitor 2 and the speaker 23 of the terminal device 6. When playing a game, the terminal device 6 held by the user is basically closer to the user's ears than the monitor 2, so outputting a game sound only from the speaker 23 lets the user recognize that sound reliably. For example, this is useful when the game wants the user to pay attention to the screen of the terminal device 6, when the user should be alerted to something, or when the realism of the three-dimensional virtual game space is to be emphasized. Hereinafter, a game sound output only as such a terminal game sound may be referred to as a terminal-dedicated sound.
[0038]
Examples of such terminal-dedicated sounds are as follows. First, assume game processing that uses the two screens of the monitor 2 and the LCD 21 of the terminal device 6. For example, the user character and enemy characters are displayed on the monitor 2, while the LCD 21 displays various status information such as the hit points and life points of the user character. In this game, the user basically plays while watching the monitor 2 (for example, fighting a plurality of enemy characters in a one-to-many situation). When the user character's hit points or life points fall significantly, a sound effect (warning sound) that warns of the user character's predicament is output as a terminal-dedicated sound, so that the user is warned reliably. As another example, in game processing that involves moving through a virtual three-dimensional space, when it is raining in that space, outputting the sound of rain as a terminal-dedicated sound only on the terminal device 6 side can further enhance the realism of the virtual three-dimensional space. As yet another example, assume game processing in which the user moves through the virtual three-dimensional space and fights a plurality of enemy characters, and an enemy character is behind the user character (not displayed on the screen and invisible to the user). In such a case, the sound of the enemy character behind the user character can be output as a terminal-dedicated sound only from the terminal device 6 side to prompt the user's attention.
[0039]
The terminal device 6 is provided with the volume adjustment slider 28. The user can therefore adjust the output volume of the speaker 23 in real time by operating the volume adjustment slider 28, even during game play. When the user sets the volume to 0, no sound can be heard from the speaker 23 of the terminal device 6, so the terminal-dedicated sound output from the terminal device 6 as described above cannot be heard either. There is then a risk that the rendering effects and interest of the game intended by the developer may be diminished.
[0040]
Therefore, in the present embodiment, as shown in FIG. 6, when the volume is set to 0 with the volume adjustment slider 28 (that is, when no sound can be heard from the speaker 23), control is performed so that the terminal-dedicated sound that would otherwise be output only on the terminal device 6 side is output from the monitor 2 side. As a result, for example, the user is prevented from missing the warning sound when the user character is in a predicament as described above, and deterioration of the game experience can be reduced.
[0041]
For convenience of explanation, the following description assumes that the terminal game sound consists only of the terminal-dedicated sound. In another embodiment, however, the terminal game sound may be composed of various sounds including a terminal-dedicated sound. For example, the terminal game sound may include a sound effect A and a sound effect B, of which only the sound effect B is the terminal-dedicated sound. In such a case, when the volume is set to 0, only the sound effect B is output from the speaker 2a of the monitor 2, and the sound effect A is not output from the speaker 2a. In other words, only part of the sound included in the terminal game sound may be output from the monitor.
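A minimal sketch of that selective routing, assuming a per-sound flag rather than any mechanism stated in the patent:

```python
# The sound names and the flag that marks a sound as terminal-dedicated are
# illustrative; the patent only distinguishes terminal-dedicated sounds from the
# rest of the terminal game sound.
terminal_game_sounds = [
    {"name": "sound_effect_A", "terminal_dedicated": False},
    {"name": "sound_effect_B", "terminal_dedicated": True},
]

def sounds_for_monitor(terminal_volume: int) -> list:
    """Names of terminal sounds that should also be mixed into the monitor game sound."""
    if terminal_volume > 0:
        return []  # the terminal speaker is audible, so nothing is duplicated on the monitor
    # Volume 0: only the terminal-dedicated part (sound effect B) moves to the monitor side.
    return [s["name"] for s in terminal_game_sounds if s["terminal_dedicated"]]

print(sounds_for_monitor(0))    # ['sound_effect_B']
print(sounds_for_monitor(120))  # []
```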
[0042]
Next, with reference to FIGS. 7 to 10, the operation of the game system 1 for realizing the game
processing as described above will be described in detail.
[0043]
FIG. 7 shows an example of various data stored in the memory 12 of the game apparatus body 5
when the game process is executed.
[0044]
The game processing program 81 is a program for causing the CPU 11 of the game apparatus
body 5 to execute game processing for realizing the game.
The game processing program 81 is loaded into the memory 12 from, for example, an optical
disc.
[0045]
The processing data 82 is data used in the game processing executed by the CPU 11.
The processing data 82 includes terminal operation data 83, terminal transmission data 84,
game image data 85, game sound data 86 and the like.
[0046]
The terminal operation data 83 is operation data periodically transmitted from the terminal device 6. FIG. 8 shows an example of the configuration of the terminal operation data 83. The terminal operation data 83 includes operation button data 91, touch position data 92, set volume data 93, and the like. The operation button data 91 is data indicating the input state of the operation unit 31 (the analog stick 25, the cross key 26, and the button 27); the input to the motion sensor 32 is also included in the operation button data 91. The touch position data 92 is data indicating the position (touch position) at which an input is performed on the input surface of the touch panel 22. The set volume data 93 is data indicating the volume value set by the user's operation of the volume adjustment slider 28. For example, the volume value is expressed in 256 levels from 0 to 255.
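One possible in-memory layout for this data, sketched as a Python dataclass; the field names follow the reference numerals above, but the concrete types are assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class TerminalOperationData:
    # Inputs to the analog stick 25, cross key 26, button 27, and motion sensor 32.
    operation_button_data: Dict[str, object] = field(default_factory=dict)
    # Touch position on the input surface of the touch panel 22, if any.
    touch_position_data: Optional[Tuple[int, int]] = None
    # Volume value set by the volume adjustment slider 28.
    set_volume_data: int = 255

    def __post_init__(self) -> None:
        # The set volume is expressed in 256 levels, from 0 to 255.
        if not 0 <= self.set_volume_data <= 255:
            raise ValueError("set volume must be in the range 0-255")

sample = TerminalOperationData({"button_27": True}, (120, 80), 0)
print(sample)
```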
[0047]
The terminal transmission data 84 is data to be periodically transmitted to the terminal device 6.
The terminal transmission data 84 includes the terminal game image and the terminal game
sound.
[0048]
The game image data 85 is data to be the basis of the terminal game image and the monitor
game image. For example, data of various 3D objects appearing in the virtual game space is
included.
[0049]
The game sound data 86 is data to be the basis of the terminal game sound and the monitor
game sound. The game sound data 86 includes, for example, at least one or more of sound effect
data and BGM data. These data may be, for example, an audio file such as ADPCM format or MP3
format, or may be, for example, a format such as musical score data that can be played by the
built-in sound source of the game apparatus body 5.
[0050]
Next, the flow of the game processing executed by the CPU 11 of the game apparatus body 5 based on the game processing program 81 will be described with reference to the flowchart of FIG. 9.
[0051]
When execution of the game processing program 81 is started and predetermined initialization processing has been performed, the CPU 11 acquires the terminal operation data 83 in step S1.
[0052]
Next, in step S2, the CPU 11 executes predetermined game processing based on the operation
content (mainly operation content indicated by the operation button data 91 and the touch
position data 92) indicated by the terminal operation data 83.
For example, processing for moving various characters such as user characters and objects,
collision determination processing, score addition processing and the like are performed.
[0053]
Next, in step S3, the CPU 11 executes a process of generating a game image on which the result
of the game process is reflected.
For example, a game image is generated by capturing, with a virtual camera, the virtual game space after the user character has moved in accordance with the operation content. At this time, the CPU 11 generates two images, a monitor game image and a terminal game image, as appropriate for the game content. Specifically, the two images are generated using two virtual cameras.
[0054]
Next, in step S4, the CPU 11 uses the game sound data 86 (predetermined sound effect data or BGM data included therein) to generate the terminal game sound reflecting the result of the game processing (in this embodiment, the terminal-dedicated sound).
[0055]
Next, in step S5, the CPU 11 refers to the set volume data 93 of the terminal operation data 83
and determines whether the volume value of the terminal device 6 is zero or not.
As a result, when it is not 0 (NO in step S5), in step S6, the CPU 11 generates a monitor game
sound on which the result of the game processing is reflected based on the game sound data 86.
At this time, the monitor game sound is generated so as not to include the terminal dedicated
sound. Thereafter, the process proceeds to step S8 described later.
[0056]
On the other hand, when the result of the determination in step S5 is that the volume of the terminal device 6 is set to 0 (YES in step S5), the CPU 11 generates, in step S7, a monitor game sound that includes the terminal-dedicated sound. For example, the reproduction volume of the terminal-dedicated sound is set to a predetermined value other than 0, the terminal-dedicated sound is reproduced, and it is mixed with the monitor game sound.
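A numeric toy version of that mixing step, assuming floating-point samples in the range -1 to 1 and a hypothetical fixed reproduction gain (the text only requires "a predetermined value other than 0"):

```python
REPRODUCTION_GAIN = 0.8  # assumed reproduction volume for the terminal-dedicated sound

def mix_into_monitor(monitor_sound, terminal_dedicated):
    """Mix the terminal-dedicated sound into the monitor game sound, sample by sample."""
    n = max(len(monitor_sound), len(terminal_dedicated))
    mixed = []
    for i in range(n):
        m = monitor_sound[i] if i < len(monitor_sound) else 0.0
        t = terminal_dedicated[i] if i < len(terminal_dedicated) else 0.0
        mixed.append(max(-1.0, min(1.0, m + REPRODUCTION_GAIN * t)))  # clip to [-1, 1]
    return mixed

print(mix_into_monitor([0.1, 0.2, 0.3], [0.5, -0.5]))
```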
[0057]
Next, in step S8, the CPU 11 stores the terminal game image generated in step S3 and the terminal game sound generated in step S4 in the terminal transmission data 84, and transmits the terminal transmission data 84 to the terminal device 6. The terminal device 6 thereby outputs the corresponding game image and game sound.
[0058]
Next, in step S9, the CPU 11 outputs a monitor game sound. At this time, when the volume value
of the terminal device 6 is not 0 (NO in step S5 above), the monitor game sound not including the terminal-dedicated sound is heard from the speaker 2a. On the other hand, when the volume value of the terminal device 6 is 0 (YES in step S5 above), the monitor game sound including the terminal-dedicated sound is heard from the speaker 2a.
[0059]
Next, in step S10, the CPU 11 outputs the monitor game image generated in step S3 to the
monitor 2.
[0060]
Next, in step S11, the CPU 11 determines whether or not a predetermined condition for ending
the game processing is satisfied.
As a result, when the predetermined condition is not satisfied (NO in step S11), the process
returns to step S1, and the above-described process is repeated. When the predetermined
condition is satisfied (YES in step S11), the CPU 11 ends the game processing.
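The loop in steps S1 through S11 can be summarized as the following skeleton. The helper callables are hypothetical stand-ins for the processing described above, and the operation data is assumed to be a dict containing "set_volume_data"; only the ordering of the steps and the volume branch in step S5 are taken from the text.

```python
def run_game_loop(get_terminal_operation_data, game_step, make_images,
                  make_terminal_sound, make_monitor_sound, send_to_terminal,
                  output_monitor_sound, output_monitor_image, should_end):
    while True:
        op = get_terminal_operation_data()                                # S1
        state = game_step(op)                                             # S2
        monitor_image, terminal_image = make_images(state)                # S3
        terminal_sound = make_terminal_sound(state)                       # S4 (terminal-dedicated sound)
        if op["set_volume_data"] != 0:                                    # S5
            monitor_sound = make_monitor_sound(state, include_terminal=False)  # S6
        else:
            monitor_sound = make_monitor_sound(state, include_terminal=True)   # S7
        send_to_terminal(terminal_image, terminal_sound)                  # S8
        output_monitor_sound(monitor_sound)                               # S9
        output_monitor_image(monitor_image)                               # S10
        if should_end(state):                                             # S11
            break
```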
[0061]
Next, the flow of the control process performed by the control unit 33 of the terminal device 6 will be described with reference to the flowchart in FIG. 10. First, in step S41, the control unit 33 receives the terminal transmission data 84 transmitted from the game apparatus body 5.
[0062]
Next, in step S42, the control unit 33 outputs the terminal game image included in the received
terminal transmission data 84 to the LCD 21.
[0063]
Next, in step S43, the control unit 33 outputs, to the speaker 23, an audio signal based on data of
the terminal game sound included in the received terminal transmission data 84.
However, the volume follows the value set by the volume adjustment slider 28. Therefore, when
the volume is 0, although the audio signal itself is output to the speaker 23, the audio can not
actually be heard.
[0064]
Next, in step S44, the control unit 33 detects an input (operation content) to the operation unit
31, the motion sensor 32, and the touch panel 22, and generates operation button data 91 and
touch position data 92. In addition, the control unit 33 acquires the volume value currently set with the volume adjustment slider 28 and generates the set volume data 93 based on that value.
[0065]
Next, in step S45, the control unit 33 generates terminal operation data 83 including the
operation button data 91 generated in step S44, the touch position data 92 and the set volume
data 93, and transmits the terminal operation data 83 to the game apparatus body 5.
[0066]
Next, in step S46, the control unit 33 determines whether a predetermined condition for ending the control process of the terminal device 6 is satisfied (for example, whether a power-off operation has been performed).
As a result, when the predetermined condition is not satisfied (NO in step S46), the process
returns to step S41, and the above process is repeated. When the predetermined condition is
satisfied (YES in step S46), the control unit 33 ends the control process of the terminal device 6.
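A companion skeleton for the terminal-side steps S41 through S46, again with hypothetical helper callables; only the ordering of the steps comes from the text.

```python
def run_terminal_loop(receive_transmission, show_on_lcd, play_on_speaker,
                      read_inputs, read_volume_slider, send_operation_data,
                      power_off_requested):
    while True:
        data = receive_transmission()                           # S41
        show_on_lcd(data["terminal_game_image"])                # S42
        play_on_speaker(data["terminal_game_sound"],
                        volume=read_volume_slider())            # S43 (inaudible if the volume is 0)
        buttons, touch = read_inputs()                          # S44
        send_operation_data({"operation_button_data": buttons,  # S45
                             "touch_position_data": touch,
                             "set_volume_data": read_volume_slider()})
        if power_off_requested():                               # S46
            break
```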
[0067]
As described above, according to the configuration of the present embodiment, when information processing that outputs a certain sound only from the terminal device 6 side is performed, that sound can be output from the monitor 2 side even if the volume value on the terminal device 6 side is set to 0. This prevents the user from missing such sounds. That is, even when the volume of the terminal device 6 is set to 0 during such information processing, a sound that should be conveyed to the user can still reach the user.
[0068]
(Modified example) In the embodiment described above, the terminal-dedicated sound is output from the monitor 2 side when the volume of the terminal device 6 is set to 0. In addition, the terminal-dedicated sound may be output from the monitor 2 side even when the volume on the terminal device 6 side is not 0 but is equal to or less than a predetermined value. The predetermined value is preferably a volume at or below which the user cannot recognize the sound. That is, the terminal-dedicated sound may be output from the monitor 2 side even when the volume has been lowered to a level that is not 0 but is, for practical purposes, inaudible to the user.
[0069]
In addition, the sound quality of the terminal-dedicated sound may differ between output from the speaker 23 of the terminal device 6 and output from the speaker 2a of the monitor 2. That is, the terminal-dedicated sound may be output in accordance with the characteristics of the speaker 2a of the monitor 2 and the speaker 23 of the terminal device 6. For example, when a terminal-dedicated sound with the same content is output from the monitor 2, it may be output at a higher quality than when it is output from the terminal device 6. More specifically, two kinds of data are prepared for a given terminal-dedicated sound: low-bit-rate audio data (low-quality audio data such as 128 kbps MP3 data) and high-bit-rate audio data (high-quality audio data such as 320 kbps MP3 data). When the terminal game sound is generated (the processing of step S4), the low-quality audio data is used to generate the terminal-dedicated sound. On the other hand, when the terminal game sound is included in the monitor game sound (the processing of step S7), the high-quality audio data is used to generate the terminal-dedicated sound, which is then included in the monitor game sound.
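The two-asset idea above could be organized as in this sketch; the file names and the asset table are purely illustrative:

```python
# Keep a low-bit-rate and a high-bit-rate copy of each terminal-dedicated sound
# and pick one by destination.
TERMINAL_DEDICATED_ASSETS = {
    "warning_sound": {"terminal": "warning_128kbps.mp3", "monitor": "warning_320kbps.mp3"},
}

def pick_asset(sound_name: str, destination: str) -> str:
    """destination is 'terminal' (speaker 23) or 'monitor' (speaker 2a)."""
    return TERMINAL_DEDICATED_ASSETS[sound_name][destination]

print(pick_asset("warning_sound", "terminal"))  # low-quality data for the terminal game sound (step S4)
print(pick_asset("warning_sound", "monitor"))   # high-quality data when mixed into the monitor sound (step S7)
```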
[0070]
In addition, control may be performed such that the reproduction volume of the terminal-dedicated sound on the monitor 2 side increases as the volume of the terminal device 6 decreases. That is, as the volume value set by the volume adjustment slider 28 of the terminal device 6 approaches 0, the reproduction volume of the terminal-dedicated sound in the monitor game sound may be raised (becoming clearer and clearer), and as the set volume value approaches 255 (the maximum value), the reproduction volume of the terminal-dedicated sound in the monitor game sound may be lowered (gradually becoming inaudible).
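One possible reading of this crossfade is a linear mapping from the slider value (0 to 255) to the reproduction volume of the terminal-dedicated sound on the monitor side. The linear shape is an assumption; the paragraph above only requires the value to rise as the slider approaches 0 and fall as it approaches 255.

```python
MAX_SLIDER = 255  # maximum value of the volume adjustment slider 28

def monitor_reproduction_volume(set_volume: int) -> float:
    """0.0 (silent on the monitor side) .. 1.0 (full reproduction volume on the monitor side)."""
    set_volume = max(0, min(MAX_SLIDER, set_volume))
    return 1.0 - set_volume / MAX_SLIDER

for v in (0, 64, 128, 255):
    print(v, round(monitor_reproduction_volume(v), 2))
```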
[0071]
Further, the processing as described above is not limited to the game processing, and can be
applied to other information processing such as outputting a predetermined sound only from the
speaker of the terminal device having a switch capable of adjusting the volume.
[0072]
Further, in the above embodiment, the series of processes that causes the monitor side to output, or not output, the terminal-dedicated sound based on the set volume value of the terminal device 6 is executed in a single device (the game apparatus body 5). In other embodiments, this series of processes may be executed in an information processing system including a plurality of information processing apparatuses. For example, in an information processing system including the game apparatus body 5 and a server-side apparatus capable of communicating with the game apparatus body 5 via a network, part of the series of processes may be executed by the server-side apparatus. In such an information processing system, the server side may itself be configured from a plurality of information processing apparatuses that share and execute the server-side processing.
[0073]
DESCRIPTION OF SYMBOLS 1 ... Game system 2 ... Monitor 2a ... Speaker 3 ... Game device 5 ...
Game device main body 6 ... Terminal device 20 ... Housing 21 ... LCD 23 ... Speaker