Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2013117996
Abstract: Tactile sensations are associated with sound data to support navigation and editing of the sound data. Sound data is loaded into a memory of a computer and played back such that sound is output from an audio device. Playback of the sound is controlled by user input for navigation of the sound data. Haptic commands are generated based on the sound data and are used to output haptic sensations to the user through a haptic feedback device operated by the user. The haptic sensations correspond to one or more characteristics of the sound data and assist the user in identifying features of the sound data during navigation and editing. [Selected figure] Figure 2
Voice data output and manipulation using haptic feedback
[0001]
The present invention relates generally to systems that allow humans to interface with computer
systems, and more particularly, to methods for providing haptic feedback to users in computer
sound editing and playback environments.
[0002]
Computers have become a widely used tool for creating and editing music and other audio-related data.
Digital data representing music or other types of auditory compositions or recordings can be easily created and/or manipulated using available editing software such as Digidesign® Pro Tools. A musician can use such software to play a sound file or any portion of the sound data, and to copy, edit, or otherwise manipulate any portion of it. Graphical controls of a graphical user interface, such as sliders, knobs, buttons, pointers, and cursors, are typically displayed on a computer screen, and the user manipulates them to control the playback and editing of sound data. A visual representation of the sound data is generally displayed as one or more time-versus-amplitude graphs that the user can scale as desired. Some more sophisticated systems include hardware controls, such as a jog-shuttle wheel, a spring-centered knob that the user can rotate to play a sound selection forward or backward.
[0003]
However, one of the challenges in modern computer-based music is to let musicians engage with the computer in a manner that facilitates natural and intuitive composition and editing. Much of the editing and composition process is bound up with the physical interfaces that people use to control the computer. Traditionally, musicians have learned to work with instruments that directly couple physical manipulation to the generation of sound (e.g., the action of a piano, or a trumpet as a resonator of lip vibrations). It is difficult, however, to reproduce this type of physical relationship on a computer. Today, interaction with the computer usually takes place through a keyboard and mouse or, less frequently, through special hardware such as a custom-developed electronic music controller. These interfaces are unidirectional: a musician or other user can send physical input to the computer but receives no physical feedback.
[0004]
Current sound editing systems require musicians to use input devices such as keyboards and mice, passive scroll wheels, or passive joysticks while editing sound. In these cases, musicians must rely on auditory and visual feedback. However, musicians often perform repetitive editing tasks that require accuracy, such as navigating through a music or speech selection to find a particular region to edit or manipulate. Standard input devices and auditory and visual feedback are sometimes awkward, inefficient, or insufficiently accurate for such navigation and editing tasks, frustrating the musician's creative efforts.
[0005]
An object of the present invention is to output tactile sensations in conjunction with audio output. The haptic sensations are associated with the audio output, allowing the user to control the playback and editing of sound data more accurately and efficiently.
[0006]
More particularly, the method of the present invention associates tactile sensations with sound data to aid in navigation and editing of the sound data. At least a portion of the sound data is loaded into the memory of the computer, and the sound data is played back so that an audio signal is generated and used to output sound from an audio device. Playback of the sound is controlled by user input received by the computer from the user for navigation of the sound data. Haptic commands are generated based on the sound data and are used to output haptic sensations to the user through a haptic feedback device operated by the user. The haptic sensations correspond to one or more characteristics of the sound data and assist the user in identifying features of the sound data during navigation and editing.
[0007]
Preferably, the user can control the speed and/or direction of playback of the sound data using a speed-control or position-control paradigm. The haptic sensation can be output continuously during playback of the sound data, with a magnitude based on the amplitude of the sound data currently being played. Alternatively, the tactile sensation can be output only when a feature of the sound data having a predetermined characteristic is played back.
[0008]
One embodiment preprocesses the sound data so that haptic sensations associated with the sound data are output to the user when the sound data is played back. The sound data in memory is processed to find sound features having one or more predetermined characteristics. When such a feature is found, a marker is stored in a list of markers; the marker indicates the position of the relevant sound feature in the sound data. The position is associated with at least one haptic sensation so that, when the sound data is played back, the associated haptic sensation is output to the user when the marker is reached.
[0009]
In another, real-time embodiment, a portion of the sound data is stored in a secondary buffer and processed during playback to find characteristics of the sound data in real time. In either embodiment, the computer can display a visual representation of the sound data and a moving cursor indicating the portion of the sound data currently being played.
[0010]
The present invention advantageously enables the user to experience haptic feedback that is consistent with the sound output. For example, haptic feedback can be integrated into a digital audio editing system, allowing the user to feel haptic sensations directly related to the playback of audio data and to the manipulations performed on it. The user can navigate through the sound data to find particular locations within it, and tactile sensations can better inform the user when significant features are played back. This greatly assists the user in navigation and editing tasks, resulting in better performance, higher satisfaction, and an overall improvement of the user experience.
[0011]
These and other advantages of the present invention will become apparent to those skilled in the
art upon reading the following specification of the invention and examining several figures of the
drawings. BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a block diagram illustrating a system for providing a user with sound data manipulation capabilities enhanced with haptic feedback. FIG. 2 is a block diagram illustrating one embodiment of the haptic feedback system of FIG. 1, including a haptic feedback interface device in communication with a host computer. FIG. 3 is a side cross-sectional view of a mouse embodiment of a haptic feedback device suitable for use in the present invention. FIG. 4 is a perspective view of another embodiment 150 of interface device 12 suitable for use in the present invention. FIG. 5 is a flow diagram of a method for preprocessing sound data in accordance with the present invention. FIG. 6 is a schematic diagram of a sound waveform and tactile sensations correlated with the sound waveform. FIG. 7 is a flow chart illustrating a process for playing back preprocessed sound data and tactile sensations in accordance with the present invention. FIG. 8 is a flow chart showing a real-time playback process for outputting haptic sensations in accordance with sound playback according to the present invention. FIG. 9A is a schematic diagram showing, in basic time-versus-amplitude form, the sound and haptic waveforms of continuous haptic output with direct mapping. FIG. 9B is a schematic diagram showing, in basic time-versus-amplitude form, the sound and haptic waveforms of continuous haptic output with inverted mapping. FIG. 10 is a schematic diagram of a graphical user interface that allows the user not only to enter preferences and settings for the present invention but also to control sound playback. FIG. 11 is a schematic diagram of another graphical user interface that allows the user to enter preferences and settings for the present invention.
[0012]
FIG. 1 is a block diagram illustrating a system for providing a user with sound data manipulation capabilities enhanced with haptic feedback. FIG. 2 is a block diagram illustrating one embodiment of the haptic feedback system of FIG. 1, including a haptic feedback interface device in communication with a host computer. FIG. 3 is a side cross-sectional view of a mouse embodiment of a haptic feedback device suitable for use in the present invention. FIG. 4 is a perspective view of another embodiment 150 of interface device 12 suitable for use in the present invention. FIG. 5 is a flow diagram of a method for preprocessing sound data in accordance with the present invention. FIG. 6 is a schematic diagram of a sound waveform and tactile sensations correlated with the sound waveform. FIG. 7 is a flow chart illustrating a process for playing back preprocessed sound data and tactile sensations in accordance with the present invention. FIG. 8 is a flow chart showing a real-time playback process for outputting haptic sensations in accordance with sound playback according to the present invention. FIG. 9A is a schematic diagram showing, in basic time-versus-amplitude form, the sound and haptic waveforms of continuous haptic output with direct mapping. FIG. 9B is a schematic diagram showing, in basic time-versus-amplitude form, the sound and haptic waveforms of continuous haptic output with inverted mapping. FIG. 10 is a schematic diagram of a graphical user interface that allows the user not only to enter preferences and settings for the present invention but also to control sound playback. FIG. 11 is a schematic diagram of another graphical user interface that allows the user to enter preferences and settings for the present invention.
[0013]
FIG. 1 is a block diagram illustrating a system 10 for providing a user with sound data manipulation capabilities enhanced with haptic feedback. The host computer 14 executes a sound data manipulation application program that allows the user 16 to manipulate the sound data 15 by entering commands into the host computer. The user 16 operates the haptic feedback interface device 12 to input these commands. The haptic feedback interface device allows the user to input commands and data, and also provides kinesthetic or tactile feedback, referred to herein more generally as "haptic feedback". Using a motor or other type of actuator, these interface devices can provide physical sensations felt by a user who touches the device or manipulates its manipulandum. For example, device 12 may be a knob, mouse, trackball, joystick, or other device that the user moves in a given degree of freedom to enter a direction, value, magnitude, etc. While in physical contact with the device 12 to provide input, the user can also experience the tactile sensations output by the haptic device 12. In the present invention, haptic sensations are associated with editing and other sound manipulation functions performed in the application program of the host computer, allowing the user to carry out manipulation tasks on the sound data more easily.
[0014]
The host computer 14 also outputs a signal to the audio speaker 24 to allow the user 16 to listen to the sound data he or she has selected for playback. Outputting sound data from the speaker in harmony with the visual display of the host computer and the tactile sensations from the haptic device 12 makes it easier for the user to experience and notice specific or preselected events in the sound data. This allows the user to edit the sound more easily by identifying such events haptically in addition to auditorily and visually.
[0015]
Haptic feedback devices can mediate input and output at the computer interface. This is very effective for real-time tasks in which rapid and efficient physical responses are essential for success. A haptic feedback interface can improve user efficiency and accuracy while reducing the cognitive load required to accomplish a computer task. One essential feature of an effective music interface is that it allows the user to become immersed in the musical experience without being overly conscious of specific physical gestures, so results of this kind can be very useful for music creation and editing. The present invention allows inexpensive haptic devices to be incorporated into computer-aided music and sound editing and creation.
[0016]
FIG. 2 is a block diagram illustrating one embodiment of the haptic feedback system of FIG. 1
including haptic feedback interface device 12 in communication with host computer 14.
[0017]
The host computer 14 preferably includes a host microprocessor 20, a clock 22, a display screen
26, and an audio output device 24.
The host computer also includes other known components, such as random access memory (RAM), read-only memory (ROM), and input/output (I/O) electronics (not shown). The host computer 14 is a computing device that can take a wide variety of forms. For example, in the described embodiment, computer 14 is a PC-compatible or Macintosh personal computer, or a workstation such as a Sun or Silicon Graphics workstation. Such a computer 14 may operate under the Windows®, MacOS®, Unix®, MS-DOS, or another operating system. Alternatively, host computer 14 may be one of a variety of home video game console systems commonly connected to a television set or other display device, such as those available from Nintendo, SEGA, Sony, or Microsoft. In other embodiments, host computer system 14 may be a "set-top box", "network computer" or "Internet computer", a portable computer or gaming device, a consumer electronics device (such as a stereo component), a PDA, etc.
[0018]
The host computer 14 preferably implements a host application program with which the user interacts through the device 12 and other peripherals as appropriate. In the context of the present invention, the host application program is a digital audio editing program, as described in further detail below. Other application programs that utilize input from device 12 and output haptic feedback commands to device 12 can also be used. The host application program preferably presents options to the user through a graphical user interface (GUI) and receives input from the user. The application program may include the haptic feedback functionality described below, or haptic feedback control may be implemented in another program executed on the host computer, such as a driver or another application program. Herein, computer 14 may be said to present a "graphical environment", which may be a graphical user interface, game, simulation, or other visual environment. The computer displays "graphical objects" or "computer objects". These are not physical objects but, as is well known to those skilled in the art, logical software units of data and/or procedures that can be displayed as images on display screen 26 by computer 14. Suitable software drivers for interfacing software with haptic feedback devices are available from Immersion Corporation of San Jose, California.
[0019]
Display 26 may be included in host computer system 14 and may be a standard display screen (LCD, CRT, flat panel, etc.), 3D goggles, a projection device, or another video output device. The display device 26 displays images as instructed by an operating system, application, simulation, game, or the like. An audio output device 24, such as a speaker, provides sound output to the user. In the context of the present invention, other audio-related devices, such as mixers, amplifiers, and specialized hardware, can also be connected to the host computer. Other types of peripheral devices may also be connected to host processor 20, such as storage devices (hard disk drives, CD-ROM drives, floppy disks, etc.), printers, and other input/output devices.
[0020]
Interface devices 12, such as mice, knobs, game pads, trackballs, joysticks, and remote controls, are connected to host computer system 14 by a bi-directional bus 30. The bi-directional bus transmits signals in either direction between the host computer system 14 and the interface device. The bus 30 can be a serial interface bus, such as an RS-232 serial interface, RS-422, Universal Serial Bus (USB), MIDI, or another protocol known to those skilled in the art, or it can be a parallel bus or a wireless link. Some interfaces can also provide power to the actuators of device 12.
[0021]
Device 12 may include a local processor 40. The local processor 40 can optionally be included within the housing of device 12 to enable efficient communication with the other components of the device. Processor 40 can be provided with software instructions to wait for commands or requests from host computer 14, decode them, and handle/control input and output signals in accordance with them. In addition, processor 40 can operate independently of host computer 14 by reading sensor signals and calculating appropriate forces from stored or relayed instructions selected in accordance with those sensor signals, time signals, and host commands. Suitable microprocessors for use as the local processor 40 include, for example, the MC68HC711E9 from Motorola, the PIC16C74 from Microchip, and the 82930AX from Intel, as well as more sophisticated force feedback processors such as the Immersion Touch Sense Processor. Processor 40 may include one microprocessor chip, multiple processors and/or co-processor chips, and/or digital signal processor (DSP) functionality.
[0022]
Microprocessor 40 may receive signals from sensor 42 and provide signals to actuator assembly 44 in accordance with instructions provided by host computer 14. For example, in a local control embodiment, host computer 14 provides high-level supervisory commands to processor 40 over bus 30, and processor 40 decodes the commands and manages low-level force control loops to sensors and actuators independently of the host computer, in accordance with those high-level commands. This operation is described in detail in US Pat. Nos. 5,739,811 and 5,734,373, both of which are incorporated herein by reference. In the host control loop, a force command from the host computer instructs the processor to output a force or force sensation having specified characteristics. The local processor 40 reports position and other sensor data to the host computer, which uses them to update the running program. In the local control loop, actuator signals are provided from processor 40 to actuator 44, and sensor signals are provided from sensor 42 and other input devices 48 to processor 40. Processor 40 can process the input sensor signals to determine appropriate output actuator signals by following stored instructions. Herein, the terms "tactile sensation" or "feeling" refer to either a single force or a series of forces output by an actuator assembly that imparts a sensation to the user.
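The division of labor between these two loops can be pictured with a short sketch. The following Python code is purely illustrative and not from the patent; `host`, `sensor`, `actuator`, and the stored effect objects are hypothetical stand-ins for device firmware interfaces.

```python
def local_control_loop(host, sensor, actuator, stored_effects):
    """Illustrative local control loop: the host sends high-level
    commands; the local processor closes the low-level force loop."""
    active_effects = []
    while True:
        # Host control loop: receive high-level commands, report sensor data
        command = host.poll_command()          # hypothetical host interface
        if command is not None:
            active_effects.append(stored_effects[command.effect_id])
        position = sensor.read()
        host.report(position)                  # host updates its running program
        # Local control loop: compute and output the force independently
        force = sum(effect.compute_force(position) for effect in active_effects)
        actuator.output(force)
```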
[0023]
In yet other embodiments, other, simpler hardware can be provided locally in device 12 to provide functionality similar to that of processor 40. For example, a hardware state machine or an ASIC incorporating fixed logic can be used to provide signals to the actuator 44, receive sensor signals from the sensor 42, and output tactile signals according to a predetermined sequence, algorithm, or process.
[0024]
In a different, host-controlled embodiment, the host computer 14 can provide low-level force commands over the bus 30, which are transmitted to the actuator 44 via the processor 40. The host computer 14 thus directly controls and processes all signals to and from the device 12. In a simple host control embodiment, the signal from the host to the device can command the actuator to output a force of predetermined frequency and magnitude, can include a magnitude and/or direction, or can be a simple command indicating a desired force value to be applied over time.
[0025]
A local memory 52, such as RAM and/or ROM, is preferably coupled to the processor 40 in the device 12 to store instructions for the processor 40 and to store temporary and other data. For example, a force profile can be stored in memory 52, such as a series of stored force values that can be output by the processor, or a look-up table of force values to be output based on the current position of the user object. In addition, a local clock 54 can be coupled to the processor 40 to provide timing data, similar to the system clock of host computer 14. Timing data may be needed, for example, to compute the forces output by the actuator 44 (e.g., a force dependent on calculated velocity or another time-dependent factor). In embodiments using a USB communication interface, timing data for processor 40 can alternatively be retrieved from the USB signal.
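As an illustration of such a stored force profile, the following sketch indexes a look-up table of force values by the current position of the user object; the table values and normalization are made up for the example, not taken from the patent.

```python
# Illustrative force profile: a look-up table of normalized force
# values indexed by user-object position (values are made up).
FORCE_PROFILE = [0.0, 0.1, 0.3, 0.6, 1.0, 0.6, 0.3, 0.1, 0.0]

def force_from_profile(position, full_range):
    """Return the stored force for a position in [0, full_range]."""
    index = int(position / full_range * (len(FORCE_PROFILE) - 1))
    index = max(0, min(index, len(FORCE_PROFILE) - 1))  # clamp to the table
    return FORCE_PROFILE[index]
```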
[0026]
The sensor 42 senses the position or movement of the device and/or the manipulandum and provides signals including information indicative of that position or movement to the processor 40 (or host 14). Sensors suitable for detecting such manipulation include digital optical encoders, optical sensor systems, linear optical encoders, potentiometers, photosensors, velocity sensors, acceleration sensors, and strain gauges; any other type of sensor can also be used, whether relative or absolute. An optional sensor interface 46 can be used to convert the sensor signals into signals that can be interpreted by the processor 40 and/or the host 14.
[0027]
The actuator 44 transmits forces to the housing of the device 12 or to one or more manipulanda 60 in response to signals received from the processor 40 and/or the host computer 14. The actuator 44 can be any of a number of types of actuator, including active actuators such as DC motors, voice coils, pneumatic or hydraulic actuators, torquers, piezoelectric actuators, and moving-magnet actuators, or passive actuators such as brakes.
[0028]
An actuator interface 50 can optionally be connected between the actuator 44 and the processor 40 to convert signals from the processor 40 into signals suitable for driving the actuator 44. Interface 50 can include power amplifiers, switches, digital-to-analog converters (DACs), analog-to-digital converters (ADCs), and other components, as is known to those skilled in the art. Other input devices 48 are included in the device 12 and send input signals to the processor 40 or host 14 when manipulated by the user. Such input devices can include buttons, scroll wheels, d-pads, dials, switches, or other controls or mechanisms.
[0029]
A power supply 56, coupled to the actuator interface 50 and/or the actuator 44, can optionally be included within device 12 or provided as a separate component to supply power to the actuator. Alternatively, power can be drawn from a supply separate from device 12 or received over bus 30. Some embodiments can use a power storage device within the device (as described in US Pat. No. 5,929,607, incorporated herein by reference) to ensure that peak forces can be applied. This technique can also be used in wireless devices, in which battery power drives the haptic actuators. A safety switch 58 can optionally be included to allow the user to deactivate the actuator 44 for safety reasons.
[0030]
The actuator 44 outputs forces to the housing of the interface device 12 and/or to the manipulandum 60. The sensor 42 can sense the position or movement of the housing or manipulandum 60. Many types of interface or control devices can be used with the invention described herein. For example, such interface devices can include haptic feedback trackballs, joystick handles, steering wheels, knobs, handheld remote control devices, game pad controllers for video or computer games, styluses, grips, wheels, buttons, cellular phones, PDAs, touch pads, or other manipulable objects, surfaces, or housings.
[0031]
FIG. 3 is a side cross-sectional view of a mouse embodiment 100 of device 12 suitable for use in the present invention.
[0032]
The mouse device 100 includes a housing 101, a sensing system 102, and an actuator 104.
The housing 101 is shaped to fit the user's hand like a standard mouse while the user moves the mouse in its planar degrees of freedom and operates the button 106. Other housing shapes can be provided in many different embodiments.
[0033]
The sensor 102 detects the position of the mouse in its planar degrees of freedom, for example along the X and Y axes. In the described embodiment, sensor 102 includes a standard mouse ball 110 for providing directional input to the computer system. Optical sensors or other types of sensors can alternatively be used.
[0034]
The mouse device 100 includes one or more actuators 104 for providing haptic feedback, such as tactile sensations, to the user of the mouse. The actuator 104 is coupled to the housing 101 to provide haptic feedback to the user. In one embodiment, the actuator is coupled to an inertial mass that is moved by the actuator. The inertial forces generated by the motion of the inertial mass are applied to the housing of the mouse relative to the inertial mass, thereby conveying haptic feedback, such as tactile sensations, to the user touching the housing. In some embodiments, the actuator itself can move as the inertial mass. Such embodiments are described in more detail in US Pat. No. 6,211,861 and US patent application Ser. No. 09/585,741, both of which are incorporated herein by reference in their entirety. Other types of interface devices, such as game pads, handheld remote controls, cellular phones, and PDAs, can include actuators for inertial tactile sensations.
[0035]
Other types of interface devices and actuators can also be used with the present invention. For example, a game pad, mouse, or other device can include an eccentric rotating mass coupled to the rotational axis of an actuator to provide inertial sensations to the device housing or manipulandum. Other types of haptic devices, such as joysticks, knobs, scroll wheels, game pads, steering wheels, trackballs, and mice, can provide kinesthetic force feedback, in which forces are output in the sensed degrees of freedom of the manipulandum. For example, embodiments of kinesthetic mouse haptic devices are disclosed in US Pat. Nos. 6,100,874 and 6,166,723, both incorporated herein by reference in their entirety.
[0036]
FIG. 4 is a perspective view of another embodiment 150 of interface device 12 suitable for use in the present invention. The knob device 150 includes a control knob 152 that the user operates to control various functions of the electronic device or host computer 14. For example, the knob device can be provided in a separate housing connected to a host computer 14 running the editing software described below, or in a complete control panel that includes other controls associated with audio editing or other control functions as desired. The display 26 of the host computer 14 (or a display dedicated to the knob device) can display editing controls, as described below.
[0037]
Control knob 152 allows the user to directly manipulate the features and settings of the present
invention. The knob 152 can be a generally cylindrical object engageable by a user. In the
described embodiment, the knob 152 rotates with one rotational degree of freedom about an
axis, such as an axis A extending from the knob, as indicated by the arrow 154. The user
preferably grips or touches the circumferential surface 156 of the knob 152 and rotates it by the
desired amount. In alternative embodiments, multiple knobs 152 may be provided such that each
knob provides different or similar control functions.
[0038]
Additionally, some embodiments of control knob 152 can provide additional control functionality for the user. Preferably, the control knob 152 can be pushed and/or pulled in a degree of freedom along axis A (or approximately parallel to axis A), and this motion is sensed by an axial switch or sensor. This provides the user with an additional way to select functions or settings without having to release his or her grip on the knob. For example, in one embodiment, the user can move a cursor or other indicator displayed on display 26 using the rotation of knob 152. When the cursor has been moved to the desired setting or area on the display, the user can press knob 152 to select it. The push and/or pull functionality of the knob 152 can be provided with a spring return bias, or can be implemented so that the knob remains in its pushed or pulled position until the user actively moves it to a new position.
[0039]
In other embodiments, the knob 152 can be moved by the user in one or more transverse or lateral directions in a plane substantially perpendicular to the axis of rotation A. This lateral movement is indicated by arrow 158. For example, the knob 152 can be moved in the four orthogonal or four diagonal directions shown. This lateral movement of the knob 152 allows the user to select additional settings or functions of the controlled device, such as mode selection, cursor positioning, or the setting of values or magnitudes.
[0040]
The knob 152 is preferably provided with force feedback in at least its rotational degree of freedom. An actuator 160, such as a rotary DC motor, can be provided with its shaft coupled to the knob 152. The actuator can output forces that provide detents, springs, brakes, barriers, or other force sensations on the rotating knob. A sensor for reading the rotational position of the knob can be integrated into the actuator or provided separately. Alternatively or additionally, the lateral and/or linear axial movements of the knob can be actuated. The knob hardware implementation is described in detail in US Pat. No. 6,154,201, which is incorporated herein by reference. Suitable force sensations are described in US Pat. Nos. 5,734,374 and 6,154,201, which are incorporated herein by reference in their entirety.
[0041]
Output and Manipulation of Sound Data by Haptic Feedback The present invention improves the
user's experience in manipulating digital sound data (also referred to as "audio data") by using
the haptic sensations output by the haptic feedback interface device.
[0042]
FIG. 5 is a flow diagram of one embodiment 200 of the preprocessing of the present invention. This method can be implemented by an application program, such as the example sound composition/editing program used in the described embodiments. Alternatively, the method can be implemented by a separate application program or a driver program executed alongside the sound composition/editing program. Other types of application programs can also be used with the present invention.
[0043]
The method starts at 202, and at step 204 the settings made by the user are read. For example, the user can enter settings in advance in fields displayed in a graphical user interface. Examples of such user interfaces are discussed below in conjunction with FIGS. 10 and 11. The settings allow the user to adjust the tactile sensations and their relationship to the sound being played or edited. The user settings can also be entered in other ways, for example by a program reading a database, over a networked storage device, or the like.
[0044]
In the next step 206, all or part of a sound file or sound data, or more particularly a file containing sound data, is loaded into memory. Sound data generally consists of a series of individual digital sound samples that indicate to a driver or other I/O program or device how to generate an audio signal from the samples, as is well known to those skilled in the art. This sound data is preprocessed by the method of the present invention so that tactile sensations can be output when the sound data is later played back to the user through a speaker or other audio output device. If the method is implemented by a program separate from the sound editing or playback program, the separate program can load the sound data into memory for preprocessing.
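As a minimal sketch of this loading step, the following Python code reads 16-bit PCM samples from a WAV file into memory using the standard wave module; the file name is hypothetical, and mono, little-endian PCM is assumed.

```python
import array
import wave

def load_sound_data(path):
    """Load all 16-bit PCM samples of a WAV file into memory."""
    with wave.open(path, "rb") as wav_file:
        rate = wav_file.getframerate()
        raw = wav_file.readframes(wav_file.getnframes())
    # 'h' = signed 16-bit samples; assumes mono, little-endian PCM
    samples = array.array("h", raw)
    return samples, rate

samples, rate = load_sound_data("example.wav")  # hypothetical file name
```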
[0045]
At the next step 208, the sound data loaded into memory is preprocessed to find the desired sound characteristics to be associated with haptic sensations. For example, the method can traverse the sound data in the order in which it would be played or output by the playback device, e.g., in time sequence, or in another order in other embodiments.
[0046]
The desired sound characteristics to be searched for can differ in various embodiments and can be influenced by the user settings read in step 204. In one embodiment, the method can search for a sharp rise in amplitude in the sound data. This is illustrated in the waveform diagram 220 of FIG. 6. The sound data represents a sound waveform 222 and may have short peaks 224 corresponding to short or sudden features in the sound, such as drum beats or other percussive sounds in music. For example, the rise in amplitude at the first peak 224 may be the characteristic sought in step 208. It is preferable to impose a minimum threshold on the rise in amplitude; for example, the threshold may be a predetermined percentage, say 50%, above the average amplitude of the sound waveform. The average amplitude can be computed in advance over all of the sound data loaded into memory in step 206. The threshold percentage value may be a user setting loaded at step 204 or a default value.
[0047]
The method can also use more sophisticated techniques, including error correction, so that, for example, noise is not detected as a desired sound characteristic. For example, to register a feature as a peak in the sound data to which a tactile sensation should be assigned, the method can require that, after the rise, the sound waveform fall back below the average amplitude by a certain percentage at that point in the sound data. If the waveform does not exhibit this fall, the detected drop in amplitude can be treated as a false drop, e.g., noise, with the true drop in amplitude possibly occurring later in the waveform. This test reduces the chance of accidentally finding multiple peaks when there is only one desired peak plus noise that caused a small dip in amplitude.
[0048]
Other error-correction measures can be taken to prevent sound features from being over-counted as desired haptic features. For example, when the method encounters a peak in the sound data, it can examine the previous peak found in the data. If the currently found peak is not separated from the previous peak by a predetermined minimum time interval, the current peak is not counted as a desired haptic sound feature. This prevents tactile sensations from being assigned to sound features that occur so close together in time during playback that the user could not distinguish them, and thus also saves processing time. In other embodiments, other comparisons can be made between the current sound feature and previously found features to determine whether the current feature should be considered assignable to a tactile sensation. Other characteristics of the sound data can also be searched for, such as a predetermined frequency, a predetermined percentage decrease in amplitude, and so on.
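Step 208 and the error-correction tests above can be sketched as follows. This is a minimal illustration, not the patented implementation: the rise threshold, fall (reset) percentage, and minimum interval are assumed parameters mirroring the settings described later in connection with FIG. 10.

```python
def find_peak_markers(samples, rate, avg_amplitude,
                      peak_threshold=0.5, peak_reset=0.3,
                      min_interval_ms=100):
    """Scan sound samples for amplitude rises worth a haptic marker.

    A rise is confirmed as a peak only after the waveform falls back
    below the average by `peak_reset` (rejecting noise), and peaks
    closer together than `min_interval_ms` are discarded so that the
    resulting sensations remain distinguishable during playback.
    """
    rise_level = avg_amplitude * (1.0 + peak_threshold)
    fall_level = avg_amplitude * (1.0 - peak_reset)
    min_gap = int(rate * min_interval_ms / 1000)
    markers = []
    candidate = None
    for i, sample in enumerate(samples):
        amplitude = abs(sample)
        if candidate is None:
            if amplitude >= rise_level:
                candidate = i                  # potential peak start
        elif amplitude <= fall_level:          # true fall confirms the peak
            if not markers or candidate - markers[-1] >= min_gap:
                markers.append(candidate)
            candidate = None
    return markers
```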
[0049]
If a sound feature is found in the sound data, then in the next step 210 a marker indicating the position of the found feature is added to a running list of markers. The marker can simply be a number indicating the position at which the sound feature begins, or the number of the sound sample at some other standardized position within the feature (e.g., relative to the start of the sound data). For example, a marker is represented in FIG. 6 by the dashed line 226 at the beginning of peak 224. The identification number (or other identifier) of the sound sample at the location of line 226 can be stored in the marker list.
[0050]
The list of markers includes all of the desired sound features found in previous iterations of the method that are matched with tactile sensations, for example the haptically associated features that occurred earlier in the sound data stream if sequential processing is used. The list is organized so that, during playback, the sound samples in the list can be easily compared with the sound sample currently being played.
[0051]
Some embodiments simply store the markers in the list, and all markers are associated with a standard tactile sensation, such as a detent; thus, whenever a marker is reached during playback, a detent is output to the user. In other, more complex embodiments, each marker can be associated with a type of tactile sensation based on the type or characteristics of the found sound feature. For example, an indication of the associated tactile sensation (its type and/or parameters) can be stored with each marker in the marker list. In another embodiment, a marker can be placed at every sound sample, or at every few sound samples, and a force value proportional to the sound amplitude of that sample is computed for each marker. This allows the haptic output to continuously match the amplitude of the sound data. However, such continuous matching is better realized in the real-time processing embodiment described in connection with FIG. 8 below.
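Building on the sketch above, a marker list of this kind might be represented as follows; the effect names and the example average amplitude are illustrative assumptions, with "detent" standing in for the standard sensation.

```python
positions = find_peak_markers(samples, rate, avg_amplitude=2000)
# Simple embodiment: every marker maps to a standard detent.
marker_list = [{"sample": pos, "effect": "detent"} for pos in positions]
# More complex embodiment: store a sensation type (and parameters)
# with each marker, e.g. {"sample": pos, "effect": "jolt", "magnitude": 0.8}
```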
[0052]
At the next step 212, it is checked whether the end of the sound file or sound data has been reached, i.e., whether any data remains to be preprocessed. If data remains, the method returns to step 208 to preprocess the sound data further until another desired sound feature is found. If no more data requires preprocessing, the method ends at 214.
[0053]
FIG. 7 is a flow chart illustrating a method 250 for playing back preprocessed sound data and haptic sensations in accordance with the present invention. This method assumes that the user or an application program is outputting sound to the user via a speaker or other audio device. In one embodiment, to which the example method is primarily relevant, the application program is a music composition/editing program that plays music based on input the user provides through the haptic device 12. For example, the user can operate a knob device, turning the knob in one direction to play the sound forward and in the reverse direction to play the sound backward. Alternatively, the user can use a mouse, joystick, or other device. Users often perform this forward and reverse playback to listen to a sound sample they have just edited or composed with the application program, and they can also use this feature to place the cursor at a desired position in the sound data stream for further editing.
[0054]
The method starts at 252, and at step 254 it checks whether sound playback should start or continue. If not, the method ends at 256. If so, at step 258 new user input may be received, such as the user rotating the knob or other manipulandum of the haptic device to a new position. Alternatively, the user may provide no new input. At step 260, the sound data is played through the speaker in the forward or reverse direction indicated by the user. The direction may correspond to the direction of rotation of a knob, the direction of movement of a mouse, joystick, or wheel, etc.
[0055]
Some embodiments can provide a speed control mode that allows the user to control the speed of sound playback. The playback speed is based on the current position of the manipulandum or object relative to a reference position. For example, in the knob embodiment, the further the user rotates the knob away from its home position, the faster the playback speed. In some embodiments, a spring resistance is also applied to the knob to assist this speed control. Other embodiments can provide a position control mode, in which the sound is played only while the user's manipulandum is moving, and a specific amount of movement corresponds to a specific amount or duration of playback. For example, in position control mode the user can rotate the knob continuously clockwise to hear the music play, adjusting the speed of rotation to adjust the speed of playback; the sound stops playing when the user stops turning the knob. The mode can also determine which haptic sensations are output; for example, time-based pops or impacts are suitable for speed control mode, while position-based detents or textures can be output in position control mode. The two paradigms are contrasted in the sketch below.
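The following small sketch contrasts the two control paradigms; the gain and scaling constants are arbitrary assumptions, not values from the patent.

```python
def speed_control_rate(knob_angle, home_angle, gain=0.05):
    """Speed control: playback rate grows with displacement from home.

    Positive = forward, negative = reverse, 0.0 = stopped.
    """
    return gain * (knob_angle - home_angle)

def position_control_step(knob_delta, samples_per_degree=50):
    """Position control: sound advances only while the knob moves.

    Returns the number of samples to advance (or, if negative, rewind)
    for this update of the knob position.
    """
    return int(knob_delta * samples_per_degree)
```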
[0056]
At the next step 262, the method plays a haptic effect if a marker in the sound data is reached. The markers are preferably arranged in a list produced by the preprocessing of FIG. 5, each marker designating an associated sound sample or position in the sound data. When that sound sample or position is reached during playback, the haptic effect associated with the marker is commanded to be output. For example, in FIG. 6 the tactile sensation is designated by line 228: when marker 226 is reached during playback, the associated tactile sensation is output, which in this example is the force detent indicated by the indentation 230 in line 228. As mentioned above, some embodiments can store the type of tactile sensation with each marker, allowing a wide variety of tactile sensations to be output.
[0057]
Thus, the haptic sensation is output to the user by the haptic device essentially simultaneously with the output of the associated sound feature. For example, an output drumbeat sound coincides with the output of a force pulse or impulse from the haptic device to the user. The method then returns to step 254 to check whether playback should continue. If so, the sound continues to play, and haptic effects are played in the same manner as markers are reached.
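A forward-playback sketch of this loop is shown below; `audio_out` and `haptic_out` are hypothetical stand-ins for the audio driver and the haptic device command interface, and reverse playback would simply walk the marker list in the opposite direction.

```python
def play_with_haptics(samples, marker_list, audio_out, haptic_out, chunk=64):
    """Play sound in small chunks, firing each marker's haptic effect
    as the play position passes the marker's sample number."""
    next_marker = 0
    position = 0
    while position < len(samples):
        audio_out.write(samples[position:position + chunk])
        # Trigger every marker the play pointer has just passed
        while (next_marker < len(marker_list)
               and marker_list[next_marker]["sample"] < position + chunk):
            haptic_out.play_effect(marker_list[next_marker]["effect"])
            next_marker += 1
        position += chunk
```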
[0058]
Some embodiments can vary the output tactile sensation based on other factors, such as user input. For example, the tactile sensation can be based on the speed of sound playback controlled by the user. In one embodiment, the magnitude of detents or other tactile sensations can be directly proportional to the playback speed, or based on the relationship of the current playback speed to a predetermined threshold speed.
[0059]
Besides music, other sounds, such as speech, can be played back, for example the soundtrack or dialogue of a movie. The tactile sensations can accordingly be based on speech characteristics in the sound data. Some programs can delimit speech properties and allow the user to edit sound data representing speech output, such as sentences, words within sentences, or the pauses between or within words.
[0060]
Thus, the present invention provides tactile sensations that correlate with the output of features in the sound data. The user can navigate through the sound data to find specific points in the data, and the tactile sensations can notify the user when the cursor or pointer provided by the application program scrolls over significant features, such as peaks. This greatly aids the user's navigation and editing tasks.
[0061]
FIG. 8 is a flow diagram illustrating a real-time playback process for outputting haptic sensations in accordance with sound playback according to the present invention. In contrast to the method of FIGS. 5 and 7, in which the characteristics to be mapped to haptic sensations are preprocessed before actual sound playback, in the present method the sound data is processed during playback, thereby saving the preprocessing time and computation.
[0062]
The method starts at 302, and at step 303 sound data is loaded into memory (a primary buffer). As mentioned above, this may be a sound file to be played, or a file containing sound data within other data to be played (e.g., visual data). At step 304, the method checks whether to start or continue sound playback. If not, the method ends at 306. If so, at step 308 new user input may be received, such as the user rotating the knob or other manipulandum of the haptic device to a new position, as described above for FIG. 7. At step 310, the sound data is played through the speaker in the direction indicated by the user, as described for the method of FIG. 7. Speed control modes, position control modes, or other control paradigms can be used. For playback, the playback program (such as a music editing program) can provide a "play pointer" indicating the current sound sample being played or the sample in the sound data to be played next; the play pointer simply advances along the sound samples as they are output.
[0063]
At the next step 312, the portion of the sound data near the play pointer is stored in a secondary buffer. For example, this may be a predetermined amount of sound data, such as the 100 milliseconds of sound to be played immediately after the current position of the play pointer. At step 314, the data in the secondary sound buffer is processed and analyzed for sound characteristics. This step can be similar to processing step 208 of FIG. 5, in which features of the sound data to be associated with haptic sensations are sought; for example, rises and/or falls in amplitude can be examined. Any user preferences or settings can be used to help determine whether there are sound characteristics to be mapped to haptic sensations, and which particular haptic sensations to map. In other embodiments, such as those in which the magnitude of the tactile sensation continuously follows the sound amplitude (see FIGS. 9A and 9B), it is not necessary to find specific features of the sound.
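A minimal sketch of steps 312 and 314, assuming the 100-millisecond window mentioned above:

```python
def analyze_ahead_of_pointer(samples, play_pointer, rate, window_ms=100):
    """Copy the sound just ahead of the play pointer into a secondary
    buffer and analyze it, e.g. for its peak amplitude."""
    length = int(rate * window_ms / 1000)
    secondary_buffer = samples[play_pointer:play_pointer + length]
    peak = max((abs(s) for s in secondary_buffer), default=0)
    return secondary_buffer, peak
```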
[0064]
At the next step 316, based on the sound characteristics found at step 314, haptic commands are sent to the haptic device so that haptic effects are played. For example, if a rise in the amplitude of the sound is found, a detent sensation can be output, as in the case where a marker is found by the preprocessing method of FIG. 5. Alternatively, a tactile sensation with a magnitude continuously based on the sound amplitude can be output. For example, FIG. 9A is a schematic diagram showing two waveforms 330 and 332 in basic time-versus-amplitude form. Waveform 330 represents the sound amplitude of the sound data over time. In this real-time processing embodiment, the method can compute the magnitude of the tactile sensation, represented by waveform 332, to be continuously proportional to the sound amplitude of the data in the secondary buffer. For example, a resistance can be output on a manipulandum such as a knob, where the magnitude of the resistance corresponds to the current sound amplitude and varies with it. In FIG. 9B, the sound waveform amplitude 330 is shown in the same way, but the tactile sensation magnitude 336 varies continuously and inversely with the sound amplitude, providing a different haptic experience to the user. Other continuous mappings of sound characteristics to tactile sensations can also be used.
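Both continuous mappings can be captured in one small function; the normalization to a [0, max_force] range is an assumption made for this sketch.

```python
def continuous_force(sample_amplitude, max_amplitude, max_force=1.0,
                     inverted=False):
    """Map the current sound amplitude to a resistance magnitude.

    Direct mapping (FIG. 9A): louder sound -> stronger force.
    Inverted mapping (FIG. 9B): louder sound -> weaker force.
    """
    level = min(abs(sample_amplitude) / max_amplitude, 1.0)
    return max_force * ((1.0 - level) if inverted else level)
```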
[0065]
After the sound characteristics are found and the tactile sensations are mapped and commanded to be output, the user feels the tactile sensations at approximately the time the corresponding sound samples are played through the speaker. Once the haptic effect for the found sound characteristic has been played, the method returns to step 304 and checks whether to continue sound playback.
[0066]
FIG. 10 is a schematic diagram of a graphical user interface 400 that not only allows the user to enter preferences and settings for the present invention but also allows the user to easily control sound playback to test those settings. These settings can, for example, be incorporated into an application program, such as a sound/music editing program, or used in a separate test program.
[0067]
The sound control parameters 402 can include a sound file field 404 that allows the user to select the sound file to be played, containing the sound data with which the haptic sensations are correlated. The sound file can be loaded into memory in its entirety, and in some embodiments an inverted version of the file can be loaded into a second buffer to allow reverse playback. A status field 406 can display the current status of the program, such as opening a file, processing sound data, or creating the reverse buffer for reverse playback. The playback adjustment fields 408 allow the user to enter values that customize the playback of the sound data: a frequency field to adjust the playback speed (initially set automatically to the normal playback frequency of the file), a pan field to adjust the balance of playback between the left and right speakers, and a volume field to adjust the output amplitude of the sound. Slider bars 410 can also be used to enter these settings. The loop sound box 412 allows the user to select whether the sound file repeats or stops when the end of its data is reached. The buttons 414 allow the user to start, stop, or pause playback of the sound file.
[0068]
The processing parameters 420 allow the user to adjust parameters that affect how tactile sensations are generated from the sound data. The peak threshold parameter 422 specifies the rise of the sound signal, relative to the average sound level, that triggers a haptic event as described above. The parameter can be specified as a percentage of the average sound amplitude; for example, a rise of 50% or more above the average amplitude is significant enough for a tactile sensation to be associated with it. The peak reset parameter 424 specifies the percentage decrease in sound amplitude (relative to the average amplitude) that must follow a detected rise for that rise to be counted as a significant peak warranting a tactile sensation; this prevents the detection of the false multiple peaks described above. The minimum beat interval parameter 426 is another error-checking parameter that lets the user specify the time interval (e.g., in milliseconds) that must separate two peaks for the second peak to be counted; if the minimum interval is absent, the second peak is treated as a false peak caused by noise, as described above. The window size parameter 428 allows the user to specify, in number of sound samples, the window size that defines the resolution used to average the sound data amplitude; the method can average the samples within each window and then average all the windows together to find the average sound amplitude.
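The two-stage averaging controlled by the window size parameter can be sketched as follows:

```python
def average_amplitude(samples, window_size=1024):
    """Average the samples within each window, then average the
    window means together (the resolution set by parameter 428)."""
    window_means = []
    for start in range(0, len(samples), window_size):
        window = samples[start:start + window_size]
        window_means.append(sum(abs(s) for s in window) / len(window))
    return sum(window_means) / len(window_means) if window_means else 0.0
```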
[0069]
The device selection 430 allows the user to indicate which type of haptic device 12 is currently being used with the system and will output the haptic sensations; different devices require different commands to output tactile sensations. A knob device with position or speed control, a tactile-feedback mouse, and a kinesthetic-feedback mouse are shown as options, but any haptic device can be used. The mouse control fields 432 allow the user to control sound playback with the mouse cursor and can be used to test how the haptic output changes with the sound playback and the user's preferences and settings. For example, the shuttle field 436 lets the user move the cursor within the field to play the sound file in position control mode: pressing the button while moving the cursor left and right in the shuttle field 436 plays the music at a rate proportional to the speed of the cursor, moving right for forward playback and left for reverse playback. The scroll field 434 lets the user move the cursor within the field to play the sound file in speed control mode: the user places or moves the cursor to control the direction and speed of playback. If the cursor is to the right of the center point of field 434, forward playback occurs at a rate proportional to the distance between the cursor and the center point; if the cursor is to the left of the center point, reverse playback likewise occurs proportionally.
[0070]
The test button 438 applies to the preprocessing embodiment described with respect to FIG. 5 and allows the user to initiate reprocessing of the sound data in memory when the processing parameters or device selection have been changed.
[0071]
FIG. 11 is a schematic diagram of another graphical user interface 500 that allows the user to enter additional preferences and settings for the present invention.
If desired, any or all of these settings, together with some or all of the settings of FIG. 10, can be combined in a single interface. The sound file field 502 allows the user to specify a sound file, and the haptic device setting 504 allows the user to select the type of haptic device.
[0072]
The filter settings 506 allow the user to customize the frequency ranges of the filters that can be used in the sound data processing step of the method of FIG. 5 or FIG. 8 in some embodiments of the present invention. Low-pass, high-pass, and/or band-pass filters can be used to isolate the different frequency ranges of the sound data to which haptic sensations are to be applied. The user can set cutoffs and range limits for these filters: the low-pass filter removes frequencies above the user-specified cutoff, the high-pass filter removes frequencies below the user-specified cutoff, and the band-pass filter removes frequencies outside the user-specified range.
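As an illustrative sketch of such filtering (assuming SciPy is available; the cutoff values in the usage comment are arbitrary):

```python
from scipy.signal import butter, sosfilt

def bandpass(samples, low_hz, high_hz, rate, order=4):
    """Keep only the user-specified frequency range; a low-pass or
    high-pass variant just changes `btype` and passes one cutoff."""
    sos = butter(order, [low_hz, high_hz], btype="bandpass",
                 fs=rate, output="sos")
    return sosfilt(sos, samples)

# e.g. isolate speech-relevant frequencies before peak detection:
# filtered = bandpass(samples, 300, 3400, rate)
```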
[0073]
The filters can be useful in sound editing tasks because different types of sound depend on different frequency ranges. For example, since the meaningful information in speech lies at higher frequencies, users editing speech will want to apply a high-pass filter so that only the relevant frequencies remain; the desired features of the speech are then more easily found at those frequencies. Similarly, music that relies heavily on a strong beat or rhythm (e.g., rock and roll, dance music, rap) carries much of its information at low frequencies, while much classical music centers on frequencies in the mid and high ranges. Depending on the style of music being played, appropriate filters can be used to isolate the meaningful frequency range. Filters are also useful when editing the sound of an individual instrument within the sound data, as the frequency range can be chosen to match the characteristics of the instrument.
[0074]
The harmonic tracking setting 508 can be switched on or off by the user. When on, the harmonic tracking feature can generate a mapping of tactile sensations (e.g., textures or vibrations) to the harmonic content, or timbre, of the sound. Haptic textures are patterns of short haptic features, such as detents or jolts, spaced or organized in a predetermined manner, which are output as the user's manipulandum moves over each "bump" location in the texture field. The haptic texture can be matched to the timbre of the output sound. For example, a clean, simple tactile texture with widely spaced "bumps" can be output while a pure tone is output as sound; in contrast, a complex, dense texture with closely spaced bumps can be output while a harmonically complex sound is output. The sound of a flute is rather pure and can be mapped to a very light and simple tactile texture, or even to no texture at all; at the other end of the spectrum, heavily distorted electric guitar is extremely complex and can be mapped to a heavy, complex tactile texture. Textures are better suited to position control mode; in speed control mode, vibrations can be output instead, with low-frequency vibrations corresponding to widely spaced textures and high-frequency vibrations corresponding to closely spaced textures.
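One way such a timbre-to-texture mapping might be sketched is with spectral flatness as a rough stand-in for harmonic complexity; this measure and the spacing range are assumptions of the sketch, not the patent's method (NumPy assumed available).

```python
import numpy as np

def texture_spacing_from_timbre(window, min_space=2.0, max_space=20.0):
    """Map harmonic complexity to haptic texture bump spacing.

    Spectral flatness is near 0 for a pure tone (wide spacing, light
    texture) and near 1 for a harmonically dense, noise-like sound
    (closely spaced, heavy texture).
    """
    spectrum = np.abs(np.fft.rfft(window)) + 1e-12
    flatness = np.exp(np.mean(np.log(spectrum))) / np.mean(spectrum)
    return max_space - flatness * (max_space - min_space)
```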
[0075]
The effect settings allow the user to adjust how portions of the tactile output feel. The continuity setting 510 allows the user to select whether the haptic output changes continuously or on an event basis. Selecting "continuous" causes the haptic output to change in real time with changes in the sound characteristics, as described above in connection with FIGS. 9A and 9B; this can be selected if the user wants to track the sound's amplitude continuously during navigation. When "event-based" is selected, only significant events in the sound data trigger haptic events, such as the rises in amplitude described above in connection with FIG. 5; for example, beats or amplitude peaks can trigger tactile detents. In other embodiments, the user can select an option allowing both continuous and event-based tactile output, such as a continuous resistance supplemented by impacts or detents superimposed on the resistive force.
[0076]
The sound-to-force magnitude mapping setting 512 allows the user to select how the haptic output responds. If direct mapping is selected, the greater the sound amplitude, the stronger the magnitude of the haptic force generated, as described above in connection with FIG. 9A. If inverse mapping is selected, the greater the sound amplitude, the weaker the haptic force generated, as described above in connection with FIG. 9B. The mapping can be applied to the continuous option of the continuity setting 510 or to the event-based option (for example, inverted detents can be output).
[0077]
The waveform 514 is a graphical representation of the sound data loaded into memory. The waveform can be displayed, for example, while the user navigates the sound data. The entire sound data file or only a portion of it can be represented. In the example shown, the cursor 516 is a vertical bar indicating the current playback position within the waveform. During navigation and sound playback, the cursor 516 moves in the direction corresponding to the playback direction and at the speed of playback (alternatively, the entire waveform can scroll while the cursor remains stationary). The user can thus watch the cursor move over features in the sound while feeling the tactile sensations corresponding to those features. In the preprocessing embodiments (see FIG. 5), which store markers indicating the sound features mapped to haptic sensations, the markers can also be displayed at their positions on the waveform.
[0078]
The invention can be used with stand-alone sound files, as used for music or speech, which can be in any of many standardized formats (wav, mp3, MIDI, etc.). In addition, the present invention can be used with sound data included within other data describing visual representations, such as videos, movies, and animations.
[0079]
While the present invention has been described in terms of several preferred embodiments,
variations, permutations, and equivalents thereof will become apparent to those skilled in the art
upon reading the present specification and studying the drawings. For example, many different
embodiments of the haptic feedback device can be used to output the haptic sensations described
herein. Furthermore, the use of specific terminology is for the purpose of clarity and not
limitation on the present invention.