JP2017067666

Patent Translate
Powered by EPO and Google
Notice
This translation is machine-generated. It cannot be guaranteed that it is intelligible, accurate,
complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or
financial decisions, should not be based on machine-translation output.
DESCRIPTION JP2017067666
Abstract: An object detection apparatus capable of improving the detection accuracy of an object is provided. The object detection apparatus includes a microphone array including a plurality of omnidirectional microphones, and a processor that processes first acoustic data collected by the microphone array. The processor sequentially changes the direction of directivity based on the first acoustic data to generate a plurality of second acoustic data items, each having directivity in an arbitrary direction, and analyzes the sound pressure level and frequency components of the second acoustic data. When the sound pressure level of a specific frequency included in the frequency components of the second acoustic data having directivity in a first direction among the arbitrary directions is equal to or higher than a first predetermined value, the processor determines that an object is present in the first direction. [Selected figure] Figure 2
Object detection apparatus, object detection system, and object detection method
[0001]
The present disclosure relates to an object detection device that detects an object, an object
detection system, and an object detection method.
[0002]
BACKGROUND Conventionally, there has been known a flying object monitoring apparatus capable of detecting the presence and flying direction of an object using sound detection units that detect sound in each direction (see, for example, Patent Document 1).
03-05-2019
1
[0003]
JP 2006-168421 A
[0004]
In the flying object monitoring apparatus described in Patent Document 1, the detection accuracy
of the monitoring target flying object is insufficient.
[0005]
The present disclosure has been made in view of the above circumstances, and provides an object
detection apparatus, an object detection system, and an object detection method that can
improve the detection accuracy of an object.
[0006]
An object detection apparatus of the present disclosure includes a microphone array including a
plurality of omnidirectional microphones, and a processor that processes first acoustic data
collected by the microphone array.
The processor sequentially changes the direction of directivity based on the first acoustic data to
generate a plurality of second acoustic data having directivity in any direction, and the sound
pressure level of the second acoustic data And analyze frequency components.
When the sound pressure level of the specific frequency included in the frequency component of
the second acoustic data having directivity in the first direction among the arbitrary directions is
greater than or equal to the first predetermined value, the processor determines the first
direction. It is determined that an object exists in
[0007]
According to the present disclosure, the detection accuracy of an object can be improved.
[0008]
A schematic view showing a schematic configuration example of an object detection system in the first embodiment.
A block diagram showing a configuration example of an object detection system in the first embodiment.
A timing chart showing an example of a sound pattern of a moving object registered in a memory.
A timing chart showing an example of frequency change of acoustic data obtained as a result of frequency analysis processing.
A schematic view showing an example of a state in which a pointing range is scanned in a monitoring area to detect a moving object.
A schematic view showing an example of a state in which a pointing direction is scanned within a pointing range to detect a moving object.
A flowchart showing a first operation example of a detection processing procedure of a moving object in the first embodiment.
A flowchart showing a second operation example of a detection processing procedure of a moving object in the first embodiment.
A schematic view showing an example of an omnidirectional image captured by an omnidirectional camera in the first embodiment.
A block diagram showing a configuration of an object detection system in a modification of the first embodiment.
A flowchart showing a detection processing procedure of a moving object in a modification of the first embodiment.
A schematic view showing a schematic configuration example of an object detection system in the second embodiment.
A block diagram showing a configuration example of the object detection system in the second embodiment.
A timing chart for explaining an example of a method of distance detection in the second embodiment.
A flowchart showing an operation example of the object detection system in the second embodiment.
A schematic view showing an example of an omnidirectional image captured by an omnidirectional camera in the second embodiment.
A schematic view showing a schematic configuration example of an object detection system in the third embodiment.
A block diagram showing a configuration example of the object detection system in the third embodiment.
A flowchart showing an operation example of the object detection system in the third embodiment.
A schematic view showing an example of an image captured by a PTZ camera in the third embodiment.
A schematic view showing a schematic configuration example of an object detection system in the fourth embodiment.
A block diagram showing a configuration example of the object detection system in the fourth embodiment.
A flowchart showing an operation example of the object detection system in the fourth embodiment.
A schematic view showing a schematic configuration example of the object detection system in the fifth embodiment.
A block diagram showing a configuration example of the object detection system in the fifth embodiment.
A schematic diagram for explaining an example of a method of detecting a distance to a moving object using two sound source detection devices.
A flowchart showing an operation example of the object detection system in the fifth embodiment.
[0009]
Hereinafter, embodiments will be described in detail with reference to the drawings as
appropriate.
However, more detailed description than necessary may be omitted.
For example, detailed description of already well-known matters and redundant description of
substantially the same configuration may be omitted.
This is to avoid unnecessary redundancy in the following description and to facilitate
understanding by those skilled in the art.
It is to be understood that the attached drawings and the following description are provided to
enable those skilled in the art to fully understand the present disclosure, and they are not
intended to limit the claimed subject matter.
[0010]
In the flying object monitoring apparatus described in Patent Document 1, a directional
microphone is used as a sound detection unit.
This flying object monitoring apparatus detects sounds in each direction either by turning a single directional microphone or by installing a plurality of directional microphones, one facing each direction covering the monitoring area.
[0011]
When a single directional microphone is turned, the turning takes time, making it difficult to detect sounds in all directions simultaneously. Therefore, if the object moves during the turn, the detection accuracy of the object decreases.
[0012]
When a plurality of directional microphones are installed, one facing each direction covering the monitoring area, an area in which detection of an object is difficult (for example, a region that adjacent directional microphones cannot cover) may arise. When an object is located in such an area, the detection accuracy of the object decreases.
[0013]
Hereinafter, an object detection apparatus, an object detection system, and an object detection
method capable of improving the detection accuracy of an object will be described.
[0014]
First Embodiment [Configuration, Etc.] FIG. 1 is a schematic view showing a schematic
configuration of an object detection system 5 according to a first embodiment.
The object detection system 5 detects a moving object dn. The moving object dn is an example of a detection target. Examples of the moving object dn include a drone, a radio-controlled helicopter, and an unmanned reconnaissance aircraft.
[0015]
In this embodiment, a multicopter-type drone equipped with a plurality of rotors (rotary blades) is illustrated as the moving object dn. In a multicopter-type drone, generally, when the number of blades per rotor is two, a harmonic at twice a specific fundamental frequency is generated, along with further harmonics at integer multiples of that frequency. Similarly, when the number of blades is three, a harmonic at three times the fundamental frequency is generated, along with further harmonics at its multiples. The same applies when the number of blades is four or more.
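As a rough illustration of this harmonic structure, the series of expected frequencies can be enumerated from the rotor revolution rate and the blade count. This is a minimal sketch; the function name and parameters are hypothetical and not part of the patent:

```python
def blade_pass_harmonics(rotor_rev_hz, n_blades, count=4):
    """Enumerate the expected harmonic series for one rotor.

    The fundamental (blade-pass) frequency is the rotor revolution
    rate multiplied by the number of blades; higher harmonics sit
    at integer multiples of that fundamental.
    """
    base = rotor_rev_hz * n_blades
    return [base * k for k in range(1, count + 1)]
```

For example, a two-blade rotor spinning at 100 revolutions per second yields components at 200, 400, 600, and 800 Hz.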
[0016]
The object detection system 5 has a configuration including a sound source detection device 30,
a control box 10, and a monitor 50. The sound source detection device 30 has a microphone
array MA and an omnidirectional camera CA.
[0017]
The sound source detection device 30 uses the microphone array MA to pick up sounds in all
directions in a sound collection space (sound collection area) in which the device itself is
installed. The sound source detection device 30 has a housing 15 in which an opening is formed at the center, and the microphone array MA. The collected sounds include, for example, mechanical sounds, voices, and other sounds.
[0018]
The microphone array MA includes a plurality of omnidirectional microphones M1 to M8 arranged concentrically at predetermined intervals (for example, uniform intervals) along the circumferential direction around the opening of the housing 15. As each microphone, for example, an electret condenser microphone (ECM) is used. The microphone array MA sends the acoustic data of the collected sound to its subsequent components. The arrangement of the microphones M1 to M8 described above is an example, and another arrangement or shape may be used.
[0019]
The microphone array MA also includes a plurality of amplifiers that respectively amplify the output signals of the plurality of microphones M1 to Mn (for example, n = 8). The analog signal output from each amplifier is converted to a digital signal by an A/D converter 31 described later.
[0020]
The number of omnidirectional microphones is not limited to eight, and may be another number (for example, 16 or 32).
[0021]
An omnidirectional camera CA is accommodated inside the opening of the housing 15 of the
microphone array MA.
The omnidirectional camera CA is a camera equipped with a fisheye lens capable of capturing an
omnidirectional image. The omnidirectional camera CA functions as, for example, a surveillance
camera capable of imaging an imaging space (imaging area) in which the sound source detection
device 30 is installed. That is, the omnidirectional camera CA has an angle of view of 180° in the vertical direction and 360° in the horizontal direction, and images, for example, a hemispherical monitoring area 8 (see FIG. 5) as its imaging area.
[0022]
In the sound source detection device 30, the omnidirectional camera CA and the microphone
array MA are coaxially arranged by incorporating the omnidirectional camera CA inside the
opening of the housing 15. As described above, when the optical axis of the omnidirectional camera CA coincides with the central axis of the microphone array MA, the imaging area and the sound collection area around the axial direction (horizontal direction) become substantially the same, and an image position and a sound collection position can be expressed in the same coordinate system.
[0023]
The sound source detection device 30 is attached, for example, such that its sound collecting surface and imaging surface face upward, in order to detect a moving object dn flying in from the sky.
[0024]
The sound source detection device 30 forms directivity (beamforming) in an arbitrary direction with respect to the omnidirectional sound collected by the microphone array MA, and emphasizes the sound in that directivity direction. The technology for directivity control processing of acoustic data that beamforms the sound picked up by the microphone array MA is known, as shown, for example, in Reference Patent Documents 1 and 2 (Reference Patent Document 1: JP 2014-143678 A; Reference Patent Document 2: JP 2015-029241 A).
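Directivity forming of this kind is commonly realized by delay-and-sum beamforming. The following is a minimal pure-Python sketch under simplifying assumptions (far-field plane wave, 2-D microphone positions, integer-sample delays); the function and parameter names are illustrative and not the patent's API:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature

def delay_and_sum(channels, mic_xy, azimuth_rad, fs):
    """Steer a beam toward `azimuth_rad` by delaying each channel
    according to its microphone position and summing.

    Uses integer-sample delays for simplicity; a practical
    implementation would use fractional-delay filters.
    """
    # Far-field model: the delay for each microphone is the projection
    # of its position onto the look direction, divided by the speed of
    # sound, converted to samples.
    ux, uy = math.cos(azimuth_rad), math.sin(azimuth_rad)
    delays = [round((x * ux + y * uy) / SPEED_OF_SOUND * fs)
              for x, y in mic_xy]
    base = min(delays)
    delays = [d - base for d in delays]  # make all delays non-negative
    n = len(channels[0])
    out = [0.0] * n
    for ch, d in zip(channels, delays):
        for i in range(d, n):  # leading samples lose some contributions
            out[i] += ch[i - d]
    return [v / len(channels) for v in out]
```

Signals arriving from the steered direction add coherently while sounds from other directions partially cancel, which is the emphasis effect described above.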
[0025]
The sound source detection device 30 processes an imaging signal involved in imaging using the
omnidirectional camera CA to generate an omnidirectional image.
[0026]
The control box 10 outputs predetermined information to, for example, the monitor 50 based on the sound collected by the sound source detection device 30 and on the image captured by the omnidirectional camera CA.
For example, the control box 10 causes the monitor 50 to display an omnidirectional image and a
sound source direction image sp1 (see FIG. 9) of the detected moving object dn. The control box
10 is configured by, for example, a PC (Personal Computer) or a server.
[0027]
The monitor 50 displays an omnidirectional image captured by the omnidirectional camera CA.
The monitor 50 also generates and displays a composite image in which the sound source
direction image sp1 is superimposed on the omnidirectional image. The monitor 50 may be
configured as an integrated device with the control box 10.
[0028]
In FIG. 1, the sound source detection device 30 and the omnidirectional camera CA are connected to the control box 10 without a network, and data is transmitted directly; that is, each device has a communication interface. The devices may instead be communicably connected to one another via a network. The network may be a wired network (for example, an intranet, the Internet, or a wired LAN (Local Area Network)) or a wireless network (for example, a wireless LAN).
[0029]
FIG. 2 is a block diagram showing the configuration of the object detection system 5.
[0030]
The sound source detection device 30 includes an image sensor 21, an imaging signal processing
unit 22, and a camera control unit 23.
The sound source detection device 30 includes a microphone array MA, an A / D converter 31, a
buffer memory 32, a directivity processing unit 33, a frequency analysis unit 34, an object
detection unit 35, a detection result determination unit 36, a scan control unit 37, and A
detection direction control unit 38 is provided.
[0031]
The image sensor 21, the imaging signal processing unit 22, and the camera control unit 23
operate as an omnidirectional camera CA and belong to a system (image processing system) that
processes an image signal. The A/D converter 31, the buffer memory 32, the directivity processing unit 33, the frequency analysis unit 34, the object detection unit 35, the detection result determination unit 36, the scan control unit 37, and the detection direction control unit 38 belong to a system (sound processing system) that processes acoustic signals.
[0032]
The processor 25 executes a program held in the memory 32A to realize the functions of the imaging signal processing unit 22 and the camera control unit 23. The processor 26 executes a program held in the memory 32A to realize the functions of the directivity processing unit 33, the frequency analysis unit 34, the object detection unit 35, the detection result determination unit 36, the scan control unit 37, and the detection direction control unit 38.
[0033]
The image sensor 21 is a solid-state imaging device such as a charge coupled device (CCD) or a
complementary metal oxide semiconductor (CMOS). The image sensor 21 captures an image
(omnidirectional image) formed on an imaging surface through a fisheye lens.
[0034]
The imaging signal processing unit 22 converts a signal of an image captured by the image
sensor 21 into an electrical signal, and performs various image processing. The camera control
unit 23 controls each unit of the omnidirectional camera CA, and supplies, for example, a timing
signal to the image sensor 21.
[0035]
The A/D converter 31 performs A/D conversion (analog-to-digital conversion) on the acoustic signals output from the microphones M1 to M8 of the microphone array MA, and generates and outputs acoustic data as digital values. The number of A/D converters 31 provided is equal to the number of microphones.
[0036]
The buffer memory 32 is configured by a RAM (Random Access Memory) or the like. The buffer
memory 32 temporarily stores acoustic data collected by the microphones M1 to M8 of the
microphone array MA and converted into digital values by the A/D converter 31. As many buffer memories 32 are provided as there are microphones.
[0037]
The memory 32A is connected to the processor 26, and is configured by a ROM (Read Only
Memory) and a RAM. The memory 32A holds, for example, various data, setting information, and
programs. The memory 32A has a pattern memory in which a sound pattern unique to each
moving object dn is registered.
[0038]
FIG. 3 is a timing chart showing an example of the sound pattern of the moving object dn
registered in the memory 32A.
[0039]
The sound pattern shown in FIG. 3 is a combination of frequency patterns, and includes sounds at four frequencies f1, f2, f3, and f4 generated by the rotation of the four rotors mounted on the multicopter-type moving object dn. Each frequency is, for example, the frequency of the sound generated by the rotation of the plurality of blades pivotally supported on each rotor.
[0040]
In FIG. 3, the hatched frequency areas are areas of high sound pressure. The sound pattern may include not only the sounds of a plurality of frequencies and their sound pressures but also other sound information, for example, a sound pressure ratio representing the ratio of the sound pressures at the respective frequencies. Here, as an example, detection of the moving object dn is determined based on whether the sound pressure of each frequency included in the sound pattern exceeds a threshold.
[0041]
The directivity processing unit 33 performs the above-described directivity forming process (beamforming) using the acoustic data collected by the omnidirectional microphones M1 to M8, and performs extraction processing of acoustic data with an arbitrary direction as the pointing direction. The directivity processing unit 33 also performs extraction processing of acoustic data with a range in an arbitrary direction as the pointing range. The pointing range is a range including a plurality of adjacent pointing directions, and is intended to cover a certain spread of pointing directions as compared with a single pointing direction.
[0042]
The frequency analysis unit 34 performs frequency analysis processing on the acoustic data extracted by the directivity processing unit 33 for the pointing range or the pointing direction. In this frequency analysis processing, the frequencies included in the acoustic data of the pointing direction or the pointing range, and their sound pressures, are detected.
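The patent does not specify how this per-frequency analysis is implemented. One lightweight, illustrative possibility, when only a handful of rotor frequencies need checking, is the Goertzel algorithm, which evaluates the power at a single frequency bin without a full FFT (names and signature below are assumptions):

```python
import math

def tone_level(samples, fs, target_hz):
    """Estimate the power of one frequency component via the
    Goertzel algorithm.

    `fs` is the sample rate in Hz; the target frequency is rounded
    to the nearest DFT bin for a block of len(samples) samples.
    """
    n = len(samples)
    k = round(n * target_hz / fs)        # nearest bin index
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:                    # second-order IIR recursion
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Squared magnitude of the DFT bin, from the final two states.
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
```

Running this once per registered rotor frequency yields the frequency-versus-sound-pressure information that the pattern comparison in the following paragraphs consumes.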
[0043]
FIG. 4 is a timing chart showing frequency changes of acoustic data obtained as a result of
frequency analysis processing.
[0044]
In FIG. 4, four frequencies f11, f12, f13, and f14 and the sound pressures of the respective frequencies are obtained as acoustic data. In the figure, the irregular fluctuation of each frequency occurs, for example, because the rotation of the rotors (rotary blades) changes slightly when the moving object dn performs attitude control.
[0045]
The object detection unit 35 performs detection processing of the moving object dn. In this detection processing, the object detection unit 35 compares the sound pattern (see FIG. 4) (frequencies f11 to f14) obtained as a result of the frequency analysis processing with the sound pattern (see FIG. 3) (frequencies f1 to f4) registered in advance in the pattern memory of the memory 32A. The object detection unit 35 determines whether or not the two sound patterns are similar.
[0046]
For example, whether or not the two patterns are similar is determined as follows. If, of the four frequencies f1, f2, f3, and f4, the sound pressures of at least two frequencies included in the acoustic data exceed the threshold value, the object detection unit 35 determines that the sound patterns are similar and detects the moving object dn. The moving object dn may instead be detected when other conditions are satisfied.
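The two-out-of-four rule described above can be sketched as follows. This is an illustration only; the function name, the tolerance parameter, and the data layout are assumptions, not taken from the patent:

```python
def patterns_similar(detected, registered, th1, min_matches=2, tol=5.0):
    """Decide whether a detected sound pattern matches a registered one.

    `detected` maps detected frequency (Hz) to sound pressure level;
    `registered` lists the registered frequencies (e.g. f1..f4).
    A registered frequency counts as matched when some detected
    frequency lies within `tol` Hz of it with level >= th1.
    """
    matches = 0
    for f_reg in registered:
        for f_det, level in detected.items():
            if abs(f_det - f_reg) <= tol and level >= th1:
                matches += 1
                break
    return matches >= min_matches
```

With `min_matches=2` this reproduces the "at least two frequencies above the threshold" condition; lowering it to 1 corresponds to the relaxed variant mentioned later in the description.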
[0047]
When it is determined that the moving object dn does not exist, the detection result determination unit 36 instructs the detection direction control unit 38 to detect the moving object dn in the next pointing range without changing the size of the pointing range.
[0048]
If it is determined that the moving object dn is present as a result of scanning the pointing range, the detection result determination unit 36 instructs the detection direction control unit 38 to narrow the beamforming range for object detection. That is, the detection result determination unit 36 instructs a change of the beamforming range from the pointing range to the pointing direction. A plurality of pointing ranges may be provided, and the beamforming range may be narrowed stepwise each time the moving object dn is detected.
[0049]
The detection result determination unit 36 notifies the system control unit 40 of the detection
result of the moving object dn when it is determined that the moving object dn is present as a
result of the scanning in the pointing direction. The detection result includes information on the
detected moving object dn. The information of the moving object dn includes, for example,
identification information of the moving object dn, and position information (direction
information) of the moving object dn in the sound collection space.
[0050]
By making the beamforming range variable, the sound source detection device 30 can improve the efficiency of the object detection operation. Information on the beamforming range and on the method of narrowing it is held, for example, in the memory 32A.
[0051]
The detection direction control unit 38 controls the direction in which the moving object dn is
detected in the sound collection space based on the instruction from the detection result
determination unit 36. For example, the detection direction control unit 38 sets an arbitrary
direction or range as a detection direction or a detection range in the entire sound collection
space.
[0052]
The scan control unit 37 instructs the directivity processing unit 33 to beam form the detection
range and the detection direction set by the detection direction control unit 38 as the directivity
range and the directivity direction.
[0053]
The directivity processing unit 33 beamforms the directivity range and the directivity direction
(for example, the next directivity range in scanning) instructed from the scan control unit 37.
[0054]
The control box 10 includes a system control unit 40.
The processor 45 of the control box 10 executes the program stored in the memory 46 to
implement the function of the system control unit 40.
[0055]
The system control unit 40 controls the cooperative operation of the image processing system,
the sound processing system, and the monitor 50 of the sound source detection device 30.
For example, based on the information on the moving object dn from the detection result determination unit 36, the system control unit 40 superimposes an image or the like indicating the position of the moving object dn on the image obtained by the omnidirectional camera CA, and outputs the result to the monitor 50.
[0056]
[Operation etc.] Next, the detection operation of the moving object dn by the object detection
system 5 will be described.
[0057]
Here, the first operation and the second operation will be described.
The first operation scans the sound collection area by dividing the beamforming by the sound source detection device 30 into two stages when detecting presence from the sound pressure of the sound emitted by the moving object dn; that is, after scanning in the pointing range, scanning is performed in the pointing direction. The second operation scans the sound collection area with the beamforming range of the sound source detection device 30 fixed; that is, scanning is performed in the pointing direction from the beginning. Although the sound collection area is illustrated as being the same as the monitoring area 8, it need not be the same.
[0058]
(First Operation Example) In the first operation example, the sound source detection device 30 detects the moving object dn using the pointing range BF1. That is, the directivity processing unit 33 beamforms the acoustic data collected by the microphone array MA in the monitoring area 8 toward the pointing range BF1. Then, within the first pointing range dr1 in which the moving object dn exists, the acoustic data collected by the microphone array MA is beamformed toward the pointing direction BF2.
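The two-stage narrowing of the beamforming range can be sketched as a coarse-to-fine search loop. This is illustrative only; `detect`, the range names, and the return convention are hypothetical, not from the patent:

```python
def coarse_to_fine_scan(detect, coarse_ranges, fine_dirs_of):
    """Two-stage scan: sweep wide pointing ranges first, then sweep
    narrow pointing directions only inside a range that triggered.

    `detect(x)` is a caller-supplied predicate (e.g. wrapping the
    frequency analysis and pattern comparison) that reports whether
    the target sound is heard in range or direction `x`.
    `fine_dirs_of(rng)` enumerates the pointing directions inside
    a pointing range.
    """
    for rng in coarse_ranges:
        if detect(rng):                    # stage 1: wide ranges
            for d in fine_dirs_of(rng):    # stage 2: narrow directions
                if detect(d):
                    return d               # direction of the object
    return None                            # nothing found anywhere
```

Because a wide range is rejected with a single check, most of the monitoring area is dismissed cheaply and the fine sweep runs only where something was heard, which matches the efficiency argument made for the variable beamforming range.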
[0059]
FIG. 5 is a schematic diagram showing how the monitoring area 8 is scanned to detect a moving
object dn in an arbitrary pointing range BF1.
[0060]
In FIG. 5, the processor 26 sequentially scans a plurality of pointing ranges BF1 within the monitoring area 8. For example, when the processor 26 detects the moving object dn in the first pointing range dr1 in the monitoring area 8, it determines that the moving object dn is present in the detected first pointing range dr1. Then, the processor 26 sequentially scans, within the first pointing range dr1, arbitrary pointing directions BF2 narrower than the first pointing range dr1.
[0061]
FIG. 6 is a schematic view showing how a moving object dn is detected in an arbitrary pointing direction BF2 by scanning the first pointing range dr1.
[0062]
In FIG. 6, the processor 26 sequentially scans a plurality of pointing directions BF2 within the first pointing range dr1 in which the moving object dn was detected. For example, when the object detection unit 35 detects that the sound pressure of the specific frequency is equal to or greater than the predetermined value th1 in the first pointing direction dr2 within the first pointing range dr1, it determines that the moving object dn exists in the first pointing direction dr2.
[0063]
FIG. 7 is a flowchart showing a first operation example of a detection processing procedure of the
moving object dn by the sound source detection device 30.
[0064]
First, the directivity processing unit 33 sets the directivity range BF1 to the initial position (S1).
At this initial position, an arbitrary pointing range BF1 is set as the pointing range to be scanned.
Further, the directivity processing unit 33 may set the directivity range BF1 to any size.
[0065]
The directivity processing unit 33 determines whether the acoustic data collected by the microphone array MA and converted into digital values by the A/D converter 31 has been temporarily stored (buffered) in the buffer memory 32 (S2). If not stored, the directivity processing unit 33 returns to the process of S1.
[0066]
When the acoustic data is stored in the buffer memory 32, the directivity processing unit 33 beamforms the monitoring area 8 toward an arbitrary pointing range BF1 (for the first iteration, the initially set pointing range), and extracts the acoustic data of that range (S3).
[0067]
The frequency analysis unit 34 detects the frequencies of the acoustic data extracted for the pointing range BF1, and their sound pressures (frequency analysis processing) (S4).
[0068]
The object detection unit 35 compares the sound pattern registered in the pattern memory of the
memory 32A with the sound pattern obtained as a result of the frequency analysis process
(detection process of the moving object dn) (S5).
[0069]
The detection result determination unit 36 notifies the system control unit 40 of the result of the
comparison, and notifies the detection direction control unit 38 of the detection direction shift
(detection result determination process) (S6).
[0070]
For example, the object detection unit 35 compares the sound pattern obtained as a result of the frequency analysis processing with the four frequencies f1, f2, f3, and f4 registered in the pattern memory of the memory 32A. If, as a result of the comparison, the two sound patterns have at least two identical frequencies and the sound pressures of these frequencies are equal to or higher than the predetermined value th1, the object detection unit 35 determines that the two sound patterns are similar and that the moving object dn is present.
[0071]
Here, it is assumed that at least two frequencies coincide, but the object detection unit 35 may instead determine that the patterns are similar when one frequency coincides and the sound pressure at that frequency is equal to or higher than the predetermined value th1.
[0072]
Further, the object detection unit 35 may set an allowable frequency error for each frequency, and determine the presence or absence of the above similarity by treating frequencies within this error range as the same frequency.
[0073]
Further, in addition to the comparison of frequency and sound pressure, the object detection unit 35 may add to the determination condition that the sound pressure ratios of the sounds at the respective frequencies substantially match. In this case, since the determination condition becomes stricter, the sound source detection device 30 can more reliably identify the detected moving object dn as the pre-registered target (moving object dn), and the detection accuracy of the moving object dn can be improved.
[0074]
Based on the result of S6, the detection result determination unit 36 determines whether the moving object dn is present or absent (S7). S6 and S7 may be performed as a single process.
[0075]
When there is no moving object dn, the scan control unit 37 moves the pointing range BF1 of the
scanning target in the monitoring area 8 to the next range (S8).
[0076]
The order of sequentially moving the pointing range BF1 in the monitoring area 8 may be, for example, a spiral order from the outer circumference toward the inner circumference of the monitoring area 8, or from the inner circumference toward the outer circumference.
[0077]
As described above, by scanning the monitoring area 8 with the pointing range BF1, which has a certain spread of pointing directions, the sound source detection device 30 can reduce the time needed to determine whether the moving object dn is present in the monitoring area 8.
[0078]
Further, instead of scanning continuously as in single-stroke writing, positions may be set in advance in the monitoring area 8, and the pointing range BF1 may be moved to each position in an arbitrary order. Thereby, the sound source detection device 30 can start the detection processing from, for example, a position where the moving object dn easily intrudes, making the detection processing more efficient.
[0079]
The scan control unit 37 determines whether the scan of all directions in the monitoring area 8 has been completed (S9).
If the scanning of all directions has not been completed, the directivity processing unit 33
returns to the process of S3 and performs the same operation.
That is, the directivity processing unit 33 beamforms the pointing range at the position moved to in S8, and extracts the acoustic data of that pointing range.
[0080]
On the other hand, when it is determined in S7 that the moving object dn is present, the directivity processing unit 33 performs beamforming in an arbitrary pointing direction BF2 (the initial pointing direction) within the first pointing range dr1 in which the moving object dn was detected (see FIG. 5), and extracts the acoustic data of the pointing direction BF2 (S10).
[0081]
The frequency analysis unit 34 detects the frequency and sound pressure of the acoustic data extracted in the pointing direction BF2 (frequency analysis process) (S11).
[0082]
The object detection unit 35 compares the sound pattern registered in the pattern memory of the memory 32A with the sound pattern obtained as a result of the frequency analysis process.
If the comparison shows that the two sound patterns are similar, the object detection unit 35 determines that the moving object dn is present; if they are not similar, it determines that the moving object dn does not exist (detection process of the moving object dn) (S12).
[0083]
For example, when the sound pattern obtained as a result of the frequency analysis process shares at least two frequencies with the four frequencies f1, f2, f3, and f4 registered in the pattern memory of the memory 32A, and the sound pressure at each of those shared frequencies is equal to or higher than the predetermined value th2, the object detection unit 35 determines that the two sound patterns approximate each other and that the moving object dn is present.
The predetermined value th2 is, for example, equal to or greater than the predetermined value th1.
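The S12 criterion, at least two of the registered frequencies present with sound pressure at or above th2, can be sketched as follows. The names, the dict-based peak representation, and the frequency-tolerance parameter are assumptions of this sketch.

```python
def detect_by_pattern(peaks, registered_freqs, th2, min_matches=2, freq_tol=0.0):
    """S12 sketch: the moving object dn is judged present when at least
    min_matches of the registered frequencies (f1..f4 in the text) appear
    among the detected peaks with a sound pressure of th2 or more.

    peaks: dict mapping detected frequency [Hz] -> sound pressure level.
    """
    hits = 0
    for f_reg in registered_freqs:
        if any(abs(f_det - f_reg) <= freq_tol and level >= th2
               for f_det, level in peaks.items()):
            hits += 1
    return hits >= min_matches
```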
[0084]
As described above, when the sound pressure of acoustic data having the same frequency as the
frequency registered in the sound pattern is equal to or more than the predetermined value th2,
the detection result determination unit 36 determines that the moving object dn is present.
The other determination methods are the same as in S5.
[0085]
The detection result determination unit 36 notifies the system control unit 40 of the result of the
comparison by the object detection unit 35 and notifies the detection direction control unit 38 of
the detection direction shift (detection result determination process) (S13).
[0086]
If the moving object dn is present in S13, the detection result determination unit 36 notifies the system control unit 40 that the moving object dn is present (detection result of the moving object dn).
The notification of the detection result of the moving object dn need not be made at the timing when the detection process for one pointing direction finishes; it may instead be made after scanning of the pointing directions in one pointing range BF1 is completed, or after the omnidirectional scan is completed.
[0087]
The scan control unit 37 moves the arbitrary pointing direction BF2 to the next direction to be scanned in the first pointing range dr1 (S14).
[0088]
The detection result determination unit 36 determines whether the scan in the first pointing
range dr1 is completed (S15).
When the scan in the first directivity range dr1 is not completed, the directivity processing unit
33 returns to the process of S10.
[0089]
When the scan in the first pointing range dr1 is completed in S15, the directivity processing unit 33 proceeds to the process of S8 and repeats the above processing until scanning of all directions in the monitoring area 8 is completed in S9.
As a result, even after one moving object dn is detected, the sound source detection device 30 continues searching for moving objects dn that may be present elsewhere, so a plurality of moving objects dn can be detected.
[0090]
When the omnidirectional scan is completed in S9, the directivity processing unit 33 erases the acoustic data collected by the microphone array MA and temporarily stored in the buffer memory 32 (S16).
[0091]
After erasing the acoustic data, the processor 26 determines whether the process of detecting the
moving object dn ends (S17).
The process of detecting the moving object dn is ended in response to a predetermined event.
For example, the processor 26 may hold in the memory 32A the number of times the moving object dn has not been detected in S6 or S13, and may end the detection process of FIG. 7 based on that count. The processor 26 may also end the detection process of FIG. 7 based on the expiry of a timer or a user operation on a UI (User Interface) (not shown) of the control box 10. The process may also be ended when the power of the sound source detection device 30 is turned off.
[0092]
In S4 and S11, the frequency analysis unit 34 analyzes the frequency and also measures the sound pressure at that frequency. The detection result determination unit 36 may determine that the moving object dn is approaching the sound source detection device 30 when the sound pressure level measured by the frequency analysis unit 34 gradually increases with the passage of time.
[0093]
For example, when the sound pressure level of a predetermined frequency measured at time t11 is smaller than the sound pressure level of the same frequency measured at a later time t12, the sound pressure is increasing with time, and the moving object dn may be determined to be approaching. The sound pressure level may also be measured three or more times, and the approach of the moving object dn determined based on the transition of a statistical value (variance, average, maximum, minimum, etc.).
[0094]
In addition, when the measured sound pressure level is equal to or higher than the
predetermined value th3 that is the alert level, the detection result determination unit 36 may
determine that the moving object dn has invaded the alert area.
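The approach and intrusion judgments described above can be sketched as follows. The monotonic-rise test is only one of the variants the text allows (a two-point comparison or statistics over three or more measurements are equally valid), and all names are assumptions.

```python
def is_approaching(levels, min_samples=2):
    """Judge that the moving object dn is approaching: the sound pressure
    level of its registered frequency rises over successive measurements
    (oldest first)."""
    if len(levels) < min_samples:
        return False
    return all(a < b for a, b in zip(levels, levels[1:]))

def has_intruded(level, th3):
    """Judge intrusion into the alert area: the measured sound pressure
    level has reached the alert level th3."""
    return level >= th3
```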
[0095]
The predetermined value th3 is, for example, equal to or greater than the predetermined value
th2.
The alert area is, for example, the same area as the monitoring area 8 or an area included in the
monitoring area 8 and narrower than the monitoring area 8. The alert area is, for example, an
area in which the intrusion of the moving object dn is restricted. In addition, the approach
determination and the intrusion determination of the moving object dn may be performed by the
system control unit 40.
[0096]
Second Operation Example In the second operation example, the sound source detection device
30 sequentially scans the pointing direction BF2 in the monitoring area 8 without considering
the pointing range BF1, and detects the moving object dn.
[0097]
FIG. 8 is a flowchart showing a second operation example of the detection processing procedure
of the moving object dn by the sound source detection device 30.
The same processing as that of the first operation example shown in FIG. 7 is denoted by the
same step number, and the description thereof is omitted.
[0098]
First, the directivity processing unit 33 sets the directivity direction BF2 to the initial position
(S1A). At this initial position, an arbitrary pointing direction BF2 is set to the pointing direction of
the scanning target.
[0099]
When the acoustic data collected by the microphone array MA is temporarily stored in the buffer memory 32 in S2, the directivity processing unit 33 performs beamforming in an arbitrary pointing direction BF2 in the monitoring area 8 (the initially set pointing direction for the first iteration), and extracts the acoustic data of the pointing direction BF2 (S3A).
[0100]
In S7, according to the determination result of the presence of the moving object dn by the object detection unit 35, if the moving object dn does not exist, the scan control unit 37 moves the pointing direction BF2 to be scanned in the monitoring area 8 to the next direction (S8A).
[0101]
On the other hand, when the moving object dn is present in S7, the detection result determination unit 36 notifies the system control unit 40 that the moving object dn is present (detection result of the moving object dn) (S7A).
After this, the process proceeds to S8.
The notification of the detection result of the moving object dn may also be performed collectively after the omnidirectional scan is completed, rather than at the timing when the detection process for one pointing direction finishes.
[0102]
As described above, in the second operation example, the sound source detection device 30 does
not have to switch between scanning using the directivity range BF1 and scanning using the
directivity direction BF2, and processing can be simplified.
[0103]
FIG. 9 is a schematic view showing an omnidirectional image GZ1 captured by the
omnidirectional camera CA.
[0104]
In FIG. 9, the omnidirectional image GZ1 includes the moving object dn flying from the valley of the building Bl.
On the monitor 50, for example, a sound source direction image sp1, whose sound source is the mechanical sound of the moving object dn, is displayed superimposed on the omnidirectional image GZ1.
Here, the sound source direction image sp1 is indicated by a rectangular dotted-line frame. Instead of displaying the sound source direction image sp1, the monitor 50 may indicate the position information by displaying the position coordinates of the moving object dn on the omnidirectional image GZ1. The system control unit 40, for example, performs the processing for generating and superimposing the sound source direction image sp1.
[0105]
[Effects, Etc.] As described above, the object detection system 5 of the first embodiment includes the sound source detection device 30. The sound source detection device 30 includes a microphone array MA (microphone array) including a plurality of omnidirectional microphones M1 to M8, and a processor 26 that processes first acoustic data collected by the microphone array MA. The processor 26 sequentially changes the pointing direction BF2 (direction of directivity) based on the first acoustic data to generate a plurality of second acoustic data having directivity in arbitrary pointing directions BF2, and analyzes the sound pressure level and frequency components of the second acoustic data. If the sound pressure level of the specific frequency included in the frequency components of the second acoustic data having directivity in the first pointing direction dr2, among the arbitrary pointing directions BF2, is equal to or higher than the predetermined value th1, the processor 26 determines that the moving object dn is present in the first pointing direction dr2. The sound source detection device 30 is an example of an object detection device. The moving object dn is an example of an object.
[0106]
Thereby, by using the omnidirectional microphones, the sound source detection device 30 can collect sound from the moving object dn without rotating the sound source detection device 30. Further, since the sound source detection device 30 picks up sound in all directions at once, there is no difference in the sound collection time between directions, and sounds at the same timing can be detected. Moreover, since regions where object detection is difficult are less likely to occur, the sound source detection device 30 can improve the sensitivity of object detection. Therefore, the sound source detection device 30 can improve the detection accuracy of the moving object dn.
[0107]
Further, the processor 26 may sequentially change the pointing range BF1 based on the first acoustic data to generate a plurality of third acoustic data having directivity in arbitrary pointing ranges BF1. When the sound pressure level of the specific frequency included in the frequency components of the third acoustic data having directivity in the first pointing range dr1, among the arbitrary pointing ranges BF1, is equal to or higher than the predetermined value th1, the processor 26 may switch from scanning the pointing range BF1 to scanning the pointing direction BF2. That is, the processor 26 may sequentially change the pointing direction BF2 based on the first acoustic data to generate a plurality of second acoustic data having directivity in arbitrary pointing directions BF2 included in the first pointing range dr1. If the sound pressure level of the specific frequency included in the frequency components of the second acoustic data having directivity in the first pointing direction dr2, among those arbitrary pointing directions BF2, is equal to or higher than the predetermined value th1, the processor 26 may determine that the moving object dn is present in the first pointing direction dr2. The pointing range BF1 is an example of a direction range.
[0108]
As a result, by scanning the pointing direction BF2 after scanning the pointing range BF1, the sound source detection device 30 can improve scanning efficiency and shorten the scanning time.
[0109]
Further, when the sound pattern formed by the sound pressure levels and frequency components of the acoustic data having directivity in the first pointing direction dr2 approximates a predetermined pattern, the processor 26 may determine that the moving object dn is present in the first pointing direction dr2.
The predetermined pattern is, for example, a sound pattern stored in the memory 32A.
[0110]
Thereby, when the characteristics (sound pattern) of the sound emitted by the moving object dn are known in advance, the sound source detection device 30 can identify the moving object dn and can further improve the detection accuracy of the moving object dn.
[0111]
The processor 26 may also detect the approach of the moving object dn when the sound pressure level at a specific frequency emitted by the moving object dn and registered in the memory 32A increases with the passage of time.
Further, the processor 26 may determine that the moving object dn is present in the alert area
when the approach of the moving object dn is detected and the sound pressure level of the
specific frequency is equal to or higher than the predetermined value th3. The alert area is an
example of a predetermined area.
[0112]
Thereby, the sound source detection device 30 can notify the approach of the moving object dn
by display or the like.
[0113]
The object detection system 5 may also include a sound source detection device 30, an
omnidirectional camera CA, a control box 10, and a monitor 50.
The sound source detection device 30 transmits the determination result of the presence of the moving object dn to the control box 10. The omnidirectional camera CA captures an image with an omnidirectional angle of view. Under the control of the control box 10, the monitor 50 superimposes and displays the position information of the moving object dn determined to be present in the first pointing direction dr2 on the omnidirectional image captured by the omnidirectional camera CA. The omnidirectional camera CA is an example of a first camera. The control box 10 is an example of a control device. The position information of the moving object dn is, for example, the sound source direction image sp1.
[0114]
Thereby, the object detection system 5 allows the user to visually confirm the position of the moving object dn within the monitoring area 8, for example.
[0115]
(Modification of First Embodiment) FIG. 10 is a block diagram showing a configuration of an
object detection system 5A in a modification of the first embodiment.
Compared with the object detection system 5, the object detection system 5A includes a sound source detection device 30A instead of the sound source detection device 30. Compared with the sound source detection device 30, the sound source detection device 30A additionally includes a sound source direction detection unit 39.
[0116]
The sound source direction detection unit 39 estimates the sound source position according to a
known whitening cross correlation method (CSP (Cross-power Spectrum Phase analysis) method).
[0117]
In the CSP method, the sound source direction detection unit 39 divides the monitoring area 8 into a plurality of blocks and, when sound is collected by the microphone array MA, determines for each block whether there is sound exceeding a threshold, thereby estimating the sound source position in the monitoring area 8.
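The whitened cross-correlation that underlies the CSP method can be sketched as follows for a single microphone pair. A plain O(n^2) DFT is used for clarity, and the sign convention (a positive result means the second channel lags the first) is an assumption of this sketch.

```python
import cmath

def _dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def _idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def csp_delay(x1, x2):
    """Estimate the sample delay between two channels by whitened
    cross-correlation (CSP): the cross-power spectrum is normalised to
    unit magnitude so that only its phase, i.e. the delay, remains."""
    cross = []
    for a, b in zip(_dft(x1), _dft(x2)):
        c = b * a.conjugate()          # positive result: x2 lags x1
        m = abs(c)
        cross.append(c / m if m > 1e-12 else 0j)
    corr = [v.real for v in _idft(cross)]
    peak = max(range(len(corr)), key=lambda i: corr[i])
    return peak if peak <= len(corr) // 2 else peak - len(corr)
```

The estimated inter-microphone delay maps to an arrival direction through the array geometry; evaluating it block by block against a threshold corresponds to the block-wise estimation described above.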
[0118]
The process of estimating the sound source position by the CSP method corresponds to the above-described process of scanning the pointing range BF1 and detecting the moving object dn.
Therefore, in the modification, when the sound source direction detection unit 39 estimates the sound source position, the pointing direction BF2 is scanned within the estimated sound source position (corresponding to the first pointing range dr1), and the moving object dn is detected in the first pointing direction dr2.
[0119]
FIG. 11 is a flowchart showing a detection processing procedure of the moving object dn by the
sound source detection device 30 in the modification.
In FIG. 11, the same processes as in FIG. 7 or FIG. 8 are denoted by the same step numbers, and their description is omitted. In the modification, S1 and S3 to S9 of FIG. 7, which relate to scanning of the pointing range BF1, are omitted.
[0120]
First, sound is picked up by the microphone array MA, the acoustic data is converted into digital values by the A/D converter 31, and the directivity processing unit 33 determines whether the acoustic data is stored in the buffer memory 32 (S2). If no acoustic data is stored, S2 is repeated.
[0121]
When the sound data is stored in the buffer memory 32, the sound source direction detection
unit 39 uses this sound data to estimate the sound source position by the CSP method (S2A).
[0122]
As a result of estimating the sound source position, the sound source direction detection unit 39
determines whether a moving object dn as a sound source is detected (S2B).
[0123]
When the sound source is not detected, the sound source detection device 30 erases the sound
data stored in the buffer memory 32 (S16).
[0124]
When the sound source is detected, the processor 26 performs beamforming in the pointing directions BF2 within the sound source position (corresponding to the first pointing range dr1) and scans them sequentially to detect the moving object dn (S10 to S14, S9).
[0125]
Thus, in the sound source detection device 30A, the processor 26 may determine whether or not
the moving object dn is present in the first pointing range dr1 according to the CSP method.
Thereby, the sound source detection device 30A can omit the scanning of the pointing range BF1,
can improve the scanning efficiency, and can shorten the scanning time.
[0126]
Second Embodiment The second embodiment shows the case where a distance measuring device
for measuring the distance to the moving object dn is mounted.
[0127]
[Configuration etc.] FIG. 12 is a schematic view showing a schematic configuration of an object
detection system 5B in the second embodiment.
In the object detection system 5B, the same components as those of the object detection systems
5 and 5A of the first embodiment are denoted by the same reference numerals, and the
description thereof will be omitted or simplified.
[0128]
Similar to the first embodiment, the object detection system 5B includes a sound source detection
device 30 or 30A, a control box 10, and a monitor 50, and further includes a distance
measurement device 60.
[0129]
The distance measuring device 60 measures the distance to the detected moving object dn.
As the distance measuring method, for example, a TOF (Time Of Flight) method is used.
In the TOF method, an ultrasonic wave or a laser beam is projected toward the moving object dn, and the distance is measured from the time until the reflected wave or reflected light is received.
Here, the case of using ultrasonic waves is described as an example.
[0130]
FIG. 13 is a block diagram showing the configuration of the object detection system 5B.
The distance measuring device 60 includes an ultrasonic sensor 61, an ultrasonic speaker 62, a receiving circuit 63, a pulse transmission circuit 64, a distance detection unit 66, a distance measurement control unit 67, and a PT unit 65. The functions of the distance detection unit 66 and the distance measurement control unit 67 are realized by the processor 68 executing a predetermined program.
[0131]
When laser light is used, an invisible light sensor and an invisible light laser diode are used
instead of the ultrasonic sensor 61 and the ultrasonic speaker 62. Non-visible light includes, for
example, infrared light or ultraviolet light.
[0132]
The ultrasonic speaker 62 changes the projection direction of the ultrasonic wave as the PT unit 65 is driven, and projects the ultrasonic wave toward the moving object dn. The ultrasonic wave is projected, for example, in pulse form.
[0133]
The ultrasonic sensor 61 receives the reflected wave projected by the ultrasonic speaker 62 and
reflected by the moving object dn.
[0134]
The reception circuit 63 processes the signal from the ultrasonic sensor 61 and sends it to the
distance detection unit 66.
[0135]
The pulse transmission circuit 64 generates pulse-like ultrasonic waves projected from the
ultrasonic speaker 62 under the control of the distance measurement control unit 67 and sends
the ultrasonic waves to the ultrasonic speaker 62.
[0136]
The PT unit 65 has a drive mechanism including a motor or the like for pivoting the ultrasonic
speaker 62 in the pan (P) direction and the tilt (T) direction.
[0137]
The distance detection unit 66 detects the distance to the moving object dn based on the signal
from the reception circuit 63 under the control of the distance measurement control unit 67.
For example, the distance detection unit 66 detects the distance to the moving object dn based on the transmission time at which the ultrasonic wave is transmitted from the ultrasonic speaker 62 and the reception time at which the reflected wave is received by the ultrasonic sensor 61, and outputs the detection result of the distance to the distance measurement control unit 67.
[0138]
FIG. 14 is a timing chart for explaining the distance detection method.
[0139]
FIG. 14 shows the time difference between the pulse signal of the projected ultrasonic wave (transmission pulse) and the pulse signal of the ultrasonic wave reflected by the moving object dn (reception pulse).
With the difference between the projection timing t1 and the reception timing t2 taken as the time difference Δt, the distance detection unit 66 calculates the distance L to the moving object dn according to, for example, (Equation 1).
[0140]
Distance L = sound velocity C × time difference Δt / 2 (Equation 1)
[0141]
The sound velocity C can be obtained, for example, from (Equation 2) using the temperature T of dry air.
Sound velocity C = 331.5 + 0.6T (Equation 2)
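(Equation 1) and (Equation 2) combine directly into a small helper. The function names and the default temperature are illustrative choices, not part of the patent.

```python
def sound_velocity(temp_c):
    """Speed of sound in dry air [m/s] at temp_c [deg C] (Equation 2)."""
    return 331.5 + 0.6 * temp_c

def tof_distance(delta_t, temp_c=15.0):
    """Distance [m] to the object from the round-trip time difference
    delta_t = t2 - t1 [s] (Equation 1): L = C * delta_t / 2."""
    return sound_velocity(temp_c) * delta_t / 2
```

The division by 2 accounts for the pulse travelling to the object and back.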
[0142]
The distance measurement control unit 67 controls each part of the distance measuring device 60.
The distance measurement control unit 67 transmits information on the distance to the moving object dn detected by the distance detection unit 66 to the control box 10.
The distance measurement control unit 67 also receives from the control box 10 information on the direction in which the moving object dn is present (corresponding to the pointing direction), and instructs the PT unit 65 to turn so that the ultrasonic speaker 62 faces the moving object dn.
[0143]
The control box 10 has a memory 46 in which an alert distance is registered for determining whether the moving object dn has intruded into the alert area.
The system control unit 40 determines whether the distance to the moving object dn detected by the distance detection unit 66 is within the alert distance registered in the memory 46.
Further, the system control unit 40 may cause the monitor 50 to display the image captured by the omnidirectional camera CA together with information on the distance to the moving object dn.
[0144]
[Operation etc.] Next, an operation example of the object detection system 5B will be described.
[0145]
FIG. 15 is a flowchart showing an operation example of the object detection system 5B.
Here, the sound source detection device 30 is described as an example, but the same applies to the sound source detection device 30A (the same applies hereinafter).
[0146]
First, the object detection system 5B performs a detection process of the moving object dn by the
sound source detection device 30 (S21). The process of S21 is, for example, the process shown in
FIG. 7, FIG. 8 or FIG.
[0147]
When receiving the detection result of the moving object dn from the sound source detection device 30, the system control unit 40 causes the monitor 50 to display the detection result of the moving object dn (S22). When the moving object dn is detected, the sound source direction image sp1, whose sound source is the mechanical sound generated by the moving object dn, is superimposed on the omnidirectional image GZ1 and displayed on the monitor 50, for example.
[0148]
The system control unit 40 determines whether or not the moving object dn has been detected
based on the detection result of the moving object dn from the sound source detection device 30
(S23). If the moving object dn is not detected, the system control unit 40 returns to the process
of S21.
[0149]
When the moving object dn is detected in S23, the system control unit 40 notifies the distance
measuring device 60 of the information on the position of the detected moving object dn (S24).
Here, the position of the moving object dn corresponds to the direction of the moving object dn
with respect to the sound source detection device 30, and corresponds to the first pointing
direction dr2.
[0150]
The distance measurement control unit 67 drives the PT unit 65 to point the ultrasonic speaker 62 in the direction of the notified moving object dn (S25).
[0151]
The distance detection unit 66 detects the distance to the moving object dn under the control of
the distance measurement control unit 67 (S26).
The distance from the distance measuring device 60 to the moving object dn is approximately the same as the distance from the sound source detection device 30 to the moving object dn. The distance detection unit 66, for example, projects an ultrasonic wave from the ultrasonic speaker 62 toward the moving object dn, and then measures the distance to the moving object dn based on the time until the reflected wave is received by the ultrasonic sensor 61.
[0152]
The system control unit 40 determines whether the measured distance to the moving object dn is
within the alert distance stored in the memory 46 (S27).
[0153]
If the distance is within the alert distance, the system control unit 40 notifies the monitor 50 of the intrusion of the moving object dn into the alert area (S28).
When notified of the intrusion of the moving object dn, the monitor 50 displays information indicating that the moving object dn has entered the alert area. Thereby, by looking at the screen of the monitor 50, the user can recognize that a highly urgent situation has occurred.
[0154]
After the process of S28, the system control unit 40 returns to S21.
[0155]
On the other hand, when the distance to the moving object dn measured in S27 is equal to or greater than the alert distance, the system control unit 40 determines whether to end the various processes of FIG. 15 (the detection process for the presence of the moving object dn, the measurement of the distance to the moving object dn, and the intrusion determination of the moving object dn) (S29).
[0156]
When the various processes of FIG. 15 are not completed, the system control unit 40 returns to
the process of S21 and repeats the various processes of FIG.
On the other hand, when the various processes of FIG. 15 are ended in S29, the object detection
system 5B ends the process of FIG.
For example, when the power of the control box 10 is turned off, the process may be ended.
[0157]
The system control unit 40 may cause the monitor 50 to display the sound source direction image sp1 superimposed on the omnidirectional image GZ1, together with information on the distance to the moving object dn.
[0158]
In this case, for example, the system control unit 40 may change the display mode of the sound
source direction image sp1 based on the distance to the moving object dn.
In addition, the system control unit 40 may change the display mode of the sound source
direction image sp1 based on whether or not the moving object dn is present in the alert area.
The display form includes, for example, a display color, a size, a shape, and a type of the sound
source direction image sp1. Also, the distance information may be coordinate information.
[0159]
FIG. 16 is a schematic view showing an omnidirectional image GZ1 captured by the
omnidirectional camera CA.
[0160]
In FIG. 16, similarly to FIG. 9, the omnidirectional image GZ1 includes the moving object dn
flying from the valley of the building Bl.
For example, the sound source direction image sp1 of the moving object dn may be superimposed on the omnidirectional image GZ1 in a display form (represented by hatching in the figure) indicating that the moving object dn is located within the alert area. In addition, character information indicating the distance to the moving object dn ("15 m approaching" in FIG. 16) may be displayed.
[0161]
[Effects, etc.] As described above, the distance measuring device 60 may change the distance measurement direction by driving the PT unit 65, and may detect the distance, with the microphone array MA as the base point, to the moving object dn present in the first pointing direction dr2. The distance measuring device 60 may transmit the detection result of the distance to the control box 10. The control box 10 may determine that the moving object dn is present in the alert area if the detected distance is within the alert distance. The PT unit 65 is an example of an actuator.
[0162]
The function of the system control unit 40 is realized by the processor executing a
predetermined program. The alert distance is an example of a predetermined distance. The alert
area is an example of a predetermined area.
[0163]
Thus, the object detection system 5B can detect the distance to the moving object dn detected by the sound source detection device 30. Further, by displaying the omnidirectional image GZ1 and the sound source direction image sp1 on the monitor 50 in a display form corresponding to the distance to the moving object dn, the user can grasp the position of the moving object dn (its three-dimensional position in the sound collection space). Furthermore, the user can recognize how closely the moving object dn has approached the alert area, and can strengthen the monitoring system as needed.
[0164]
Third Embodiment In the third embodiment, a case where a PTZ camera is mounted in addition
to the distance measuring device 60 is shown.
[0165]
[Configuration etc.] FIG. 17 is a schematic view showing a schematic configuration of an object
detection system 5C in the third embodiment.
In the object detection system 5C, the same components as those of the object detection systems
5, 5A, 5B of the first and second embodiments are given the same reference numerals, and the
description thereof will be omitted or simplified.
[0166]
Similar to the second embodiment, the object detection system 5C includes a sound source
detection device 30 or 30A, a control box 10, a monitor 50, and a distance measuring device 60,
and further includes a PTZ camera 70.
[0167]
The PTZ camera 70 is a camera which is rotatable in a pan (P) direction and a tilt (T) direction as
an imaging direction, and in which a zoom magnification (Z) is variable.
The PTZ camera 70 is used, for example, as a surveillance camera.
[0168]
FIG. 18 is a block diagram showing the configuration of the object detection system 5C. The PTZ
camera 70 includes a zoom lens 71, an image sensor 72, an imaging signal processing unit 73, a
camera control unit 74, and a PTZ control unit 75. The processor 77 executes a predetermined
program to realize the functions of the imaging signal processing unit 73 and the camera control
unit 74.
[0169]
The zoom lens 71 is a lens incorporated in a lens barrel (not shown) and capable of changing the
zoom magnification. The zoom lens 71 changes the zoom magnification by driving of the PTZ
control unit 75. Further, the lens barrel pivots in the pan direction and the tilt direction by the
drive of the PTZ control unit 75.
[0170]
The image sensor 72 is a solid-state imaging device such as a CCD or a CMOS. The imaging signal
processing unit 73 converts the signal captured by the image sensor 72 into an electrical signal
and performs various image processing.
[0171]
The camera control unit 74 controls the operation of each unit of the PTZ camera 70, and supplies, for example, a timing signal to the image sensor 72.
[0172]
The PTZ control unit 75 has a drive mechanism such as a motor for changing the pan direction
and tilt direction of the lens barrel and changing the zoom magnification of the zoom lens 71.
[0173]
[Operation etc.] Next, an operation example of the object detection system 5C will be described.
[0174]
FIG. 19 is a flowchart showing an operation example of the object detection system 5C.
Here, the same processes as those in FIG. 15 of the second embodiment are given the same step numbers, and their description will be omitted or simplified.
[0175]
When the moving object dn is detected in S23, the system control unit 40 notifies the PTZ
camera 70 and the distance measuring device 60 of information on the position of the detected
moving object dn (S24A).
Here, the position of the moving object dn corresponds to the direction of the moving object dn
with respect to the sound source detection device 30, and corresponds to the first pointing
direction dr2.
[0176]
When notified of the information on the position of the moving object dn, the PTZ camera 70
changes the imaging direction in the direction of the moving object dn by driving of the PTZ
control unit 75. Further, the zoom lens 71 changes the zoom magnification so that the moving
object dn is imaged with a predetermined size by driving of the PTZ control unit 75 (S24B).
[0177]
The image sensor 72 acquires image data captured through the zoom lens 71 (S24C). This image
data is subjected to image processing as necessary, and transmitted to the system control unit
40.
[0178]
When the system control unit 40 acquires image data from the PTZ camera 70, the system
control unit 40 causes the monitor 50 to display an image based on the acquired image data
(S24D).
[0179]
FIG. 20 is a schematic view showing an image captured by the PTZ camera 70.
[0180]
The PTZ image GZ2 captured by the PTZ camera 70 includes the moving object dn flying over
the building Bl.
The zoom lens 71 changes the zoom magnification by driving the PTZ control unit 75 so that the
size of the moving object dn becomes a predetermined size with respect to the angle of view.
By changing the zoom magnification, the portion surrounded by the rectangle a, which is a part
of the PTZ image GZ2 including the moving object dn, is enlarged and displayed. In the magnified
image GZL, the size of the moving object dn is indicated by a rectangular frame (height Lg × width Wd).
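The zoom control described above, enlarging until the moving object dn occupies a predetermined fraction of the angle of view, can be illustrated with a simple ratio. This is a hypothetical sketch under the assumption of an ideal zoom lens, where on-screen extent scales linearly with focal length; the function and parameter names are not from the embodiment:

```python
def zoom_factor_for_target(current_extent_px: float,
                           frame_extent_px: float,
                           target_fraction: float) -> float:
    """Additional zoom magnification needed so that an object's on-screen
    extent becomes target_fraction of the frame.

    With an ideal zoom lens the on-screen extent scales linearly with
    focal length, so the required factor is the ratio of the fractions.
    """
    current_fraction = current_extent_px / frame_extent_px
    return target_fraction / current_fraction
```

For example, an object spanning 100 pixels of a 1000-pixel frame needs a 5× magnification to fill half the frame.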
[0181]
The processes from S25 onward, following S24D, are the same as in the second embodiment.
[0182]
In S29, the system control unit 40 determines whether or not to end the various processes in FIG. 19 (detection of the presence of the moving object dn and of the distance to the moving object dn, display of the moving object dn, and intrusion determination for the moving object dn).
When the various processes of FIG. 19 are not completed, the system control unit 40 returns to
the process of S21. On the other hand, when the various processes in FIG. 19 are ended, the
system control unit 40 ends the process of FIG. 19.
[0183]
The system control unit 40 may estimate the size of the moving object dn based on the distance to
the moving object dn and on the proportion of the displayed PTZ image GZ2 or enlarged image GZL
occupied by the moving object dn. The memory 46 of the control box 10 may hold in
advance size information (information on the size range) assumed for the object to be
detected.
[0184]
In addition, when the estimated size of the moving object dn falls within the size range held in
the memory 46, the system control unit 40 may further determine that the moving object dn is a
detection target. In this case, the object detection system 5C can roughly recognize the actual size
of the moving object dn, and can more easily identify the type of the moving object dn, for example.
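The size estimation described here, combining the measured distance with the object's on-screen proportion, can be sketched with a pinhole-camera approximation. This is a hypothetical illustration; the embodiment does not specify the camera model, and the function and parameter names are assumptions:

```python
import math

def estimate_object_size(extent_px: float, frame_px: float,
                         fov_deg: float, distance_m: float) -> float:
    """Rough physical extent of an object from its on-screen size.

    Pinhole assumption: the object's angular extent is approximately its
    pixel fraction of the frame multiplied by the field of view, and the
    physical extent subtending that angle at the given distance is
    2 * d * tan(angle / 2).
    """
    angle = math.radians(fov_deg) * (extent_px / frame_px)
    return 2.0 * distance_m * math.tan(angle / 2.0)
```

The estimate could then be compared against the size range held in the memory 46 to decide whether the object is a detection target.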
[0185]
[Effects, etc.] As described above, the PTZ camera 70 may change the imaging direction by
driving the PTZ control unit 75, and may capture the moving object dn present in the first
directivity direction dr2. The control box 10 may also estimate the size of the moving object dn
based on the size of the area of the moving object dn in the PTZ image GZ2 captured by the PTZ
camera 70 and on the distance from the microphone array MA to the moving object dn.
If the size of the moving object dn falls within a predetermined size range, it may be
determined that the moving object dn is a detection target. The PTZ control unit 75 is an
example of an actuator.
[0186]
Thereby, the object detection system 5C can acquire an image mainly including the moving
object dn. Therefore, the user can easily view the features of the moving object dn. In addition to
the sound emitted by the moving object dn, the object detection system 5C can estimate whether
or not the object is a detection target based on the size of the moving object dn, so the detection
accuracy of the moving object dn can be further improved.
[0187]
Fourth Embodiment The fourth embodiment shows an object detection system in which the PTZ
camera 70 is mounted as in the third embodiment, and the distance measuring device 60 is
omitted.
[0188]
[Configuration etc.] FIG. 21 is a schematic view showing a schematic configuration of an object
detection system 5D in the fourth embodiment.
In the object detection system 5D, the same components as those of the object detection systems
5, 5A, 5B, and 5C of the first to third embodiments are given the same reference numerals, and
the description thereof will be omitted or simplified.
[0189]
The object detection system 5D has a configuration including the sound source detection device
30 or 30A, the control box 10, the monitor 50, and the PTZ camera 70.
[0190]
[Operation etc.] Next, an operation example of the object detection system 5D will be described.
[0191]
FIG. 23 is a flowchart showing an operation example of the object detection system 5D.
The process of FIG. 23 corresponds to the flowchart shown in FIG. 19 of the third embodiment
with the processes of S25 to S28 omitted.
[0192]
That is, when the moving object dn is detected in S23, the system control unit 40 notifies the PTZ
camera 70 of information on the position of the detected moving object dn (S24A1).
In S24D, after the system control unit 40 causes the monitor 50 to display the image captured
by the PTZ camera 70, the system control unit 40 determines in S29 whether or not to end the
various processes in FIG. 23 (the detection process and the display process for the moving
object dn). When the various processes of FIG. 23 are not
completed, the system control unit 40 returns to the process of S21. On the other hand, when
the various processes for the moving object dn are ended, the system control unit 40 ends the
processing of FIG. 23.
[0193]
[Effects, etc.] As described above, when the moving object dn is detected, the object detection
system 5D can display the moving object dn at a large size in the image captured by the PTZ camera 70.
Therefore, the user can easily view the features of the moving object dn.
[0194]
Fifth Embodiment The fifth embodiment shows an object detection system provided with a
plurality of (for example, two) sound source detection devices.
[0195]
[Configuration etc.] FIG. 24 is a schematic view showing a schematic configuration of an object
detection system 5E in the fifth embodiment.
In the object detection system 5E, the same components as those of the object detection systems
5, 5A, 5B, 5C, and 5D of the first to fourth embodiments are given the same reference numerals,
and the description thereof is omitted or simplified.
[0196]
An object detection system 5E of the fifth embodiment is communicably connected to, for
example, a monitoring device 90 installed in a control room in a facility. For example, the control
box 10A is connected to the monitoring device 90 so as to be capable of wired communication or
wireless communication.
[0197]
The object detection system 5E has a configuration including a plurality of (for example, two)
sound source detection devices 30 (30B, 30C), a control box 10A, and a PTZ camera 70.
[0198]
The monitoring device 90 has a configuration including a computer device having a display 91, a
wireless communication device 92, and the like.
The monitoring device 90 displays, for example, an image transmitted from the object detection
system 5E. Thereby, the supervisor can monitor the monitoring area 8 using the monitoring
device 90.
[0199]
FIG. 25 is a block diagram showing the configuration of the object detection system 5E. The
sound source detection device 30B beamforms the omnidirectional sound collected by the
microphone array MA1 and emphasizes the sound in the pointing direction. The sound source
detection device 30C beamforms the omnidirectional sound collected by the microphone array
MA2 and extracts the sound in the pointing direction.
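The beamforming step, emphasizing sound arriving from one pointing direction, can be sketched as a basic delay-and-sum operation. This is a simplified illustration with integer-sample delays; the embodiments do not specify which beamforming algorithm is used, and the function name and interface are assumptions:

```python
import numpy as np

def delay_and_sum(signals: np.ndarray, mic_positions: np.ndarray,
                  direction: np.ndarray, fs: float,
                  c: float = 343.0) -> np.ndarray:
    """Steer a microphone array toward a unit-vector `direction` by
    delaying each channel so the wavefront arrives in phase, then averaging.

    signals: (n_mics, n_samples) synchronized recordings.
    mic_positions: (n_mics, 3) microphone coordinates in metres.
    """
    # A plane wave from `direction` reaches each microphone earlier by
    # (position . direction) / c; compensate with the complementary delay.
    adv = mic_positions @ np.asarray(direction, dtype=float) / c
    shifts = np.round((adv.max() - adv) * fs).astype(int)  # samples
    n = signals.shape[1] - shifts.max()
    aligned = np.stack([s[k:k + n] for s, k in zip(signals, shifts)])
    return aligned.mean(axis=0)  # coherent sum emphasizes the steered direction
```

Sound from the steered direction adds coherently while sound from other directions partially cancels, which is what allows the processor to scan the directivity direction by changing only the per-channel delays.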
[0200]
The configurations and operations of the sound source detection devices 30B and 30C are the
same as those of the sound source detection device 30 according to the above-described
embodiment.
[0201]
The system control unit 40 calculates the distance to the moving object dn based on the pointing
direction (the angle β in FIG. 26) in which the moving object dn is detected by the sound source
detection device 30B and the pointing direction (the angle γ in FIG. 26) in which the moving
object dn is detected by the sound source detection device 30C.
[0202]
Although not shown in FIGS. 24 to 26, the sound source detection devices 30B and 30C each
include an omnidirectional camera CA, as in the first to fourth embodiments.
[0203]
FIG. 26 is a schematic view illustrating a method of detecting the distance to the moving object
dn using the two sound source detection devices 30B and 30C.
[0204]
It is assumed that the distance between the microphone array MA1 and the microphone array
MA2 is known to be a length L [m].
In this case, the system control unit 40 calculates the distance l1 from the microphone array MA1
to the moving object dn and the distance l2 from the microphone array MA2 to the moving object
dn using, for example, trigonometry ((Equation 3) and (Equation 4)).
[0205]
l1 = L × sin γ / sin α (Equation 3)
l2 = L × sin β / sin α (Equation 4)
[0206]
The control box 10A has a system control unit 40 and a wireless communication unit 55.
The wireless communication unit 55 is wirelessly communicably connected to the wireless
communication device 92 of the monitoring device 90.
The wireless communication unit 55 transmits, for example, the position (detection direction) of
the moving object dn, the distance to the moving object dn, and the image data captured by the
PTZ camera 70, to the monitoring device 90.
Also, the wireless communication unit 55 receives, for example, a remote control signal from the
monitoring device 90 and sends it to the system control unit 40.
[0207]
[Operation etc.] Next, an operation example of the object detection system 5E will be described.
[0208]
FIG. 27 is a flowchart showing an operation example of the object detection system 5E.
In FIG. 27, the same steps as in FIG. 19 of the third embodiment carry the same step numbers,
and the explanation thereof will be omitted or simplified.
[0209]
First, the sound source detection device 30B as the first sound source detection device performs
the process shown in FIG. 7 or 11 in the first embodiment (S21).
[0210]
In the control box 10A, when the system control unit 40 receives the detection result of the
moving object dn from the sound source detection device 30B, the wireless communication unit
55 transmits the detection result of the moving object dn to the monitoring device 90 (S22A).
When receiving the detection result of the moving object dn from the object detection system 5E,
the monitoring device 90 displays the detection result on the display 91.
[0211]
When the moving object dn is not detected in S23, the system control unit 40 returns to the
process of S21.
[0212]
When the moving object dn is detected in S23, the system control unit 40 notifies the PTZ
camera 70 of the information on the position of the moving object dn (S24A1).
[0213]
When acquiring the image obtained in S24C from the PTZ camera 70, the system control unit 40
transmits this image to the monitoring device 90 (S24E).
[0214]
Similarly, the sound source detection device 30C as the second sound source detection device
performs the process shown in FIG. 7 or 11 in the first embodiment (S21A).
[0215]
In the control box 10A, when the system control unit 40 receives the detection result of the
moving object dn from the sound source detection device 30C, the wireless communication unit
55 transmits the detection result of the moving object dn to the monitoring device 90 (S22B).
[0216]
The system control unit 40 determines whether or not the moving object dn is detected based on
the detection result from the sound source detection device 30C (S23A).
When the moving object dn is detected, the system control unit 40 acquires information on the
position of the moving object dn from the detection result of the moving object dn.
[0217]
When the moving object dn is not detected in S23A, the system control unit 40 returns to the
process of S21.
[0218]
When the moving object dn is detected in S23A, the system control unit 40 calculates, based on
the detection direction of the moving object dn detected by the sound source detection device
30B, the angle β (see FIG. 26) formed at the sound source detection device 30B between the
sound source detection device 30C and the moving object dn (S25A).
Similarly, the system control unit 40 calculates, based on the detection direction of the moving
object dn detected by the sound source detection device 30C, the angle γ (see FIG. 26) formed at
the sound source detection device 30C between the sound source detection device 30B and the
moving object dn (S25A).
[0219]
Based on the angles β and γ obtained in S25A and the distance L between the microphone
array MA1 and the microphone array MA2, the system control unit 40 calculates the distances l1
and l2, for example, according to (Equation 3) and (Equation 4) (S26A).
The distance l1 is the distance from the sound source detection device 30B to the moving object
dn.
The distance l2 is the distance from the sound source detection device 30C to the moving object
dn.
[0220]
The system control unit 40 determines whether the distance l1 or the distance l2 is within the
alert distance lm (S27A).
In addition, the system control unit 40 may determine whether a distance l3 based on the
distances l1 and l2 is within the alert distance lm.
[0221]
If any of the distances l1 to l3 is within the alert distance lm, the system control unit 40 notifies
the monitoring device 90, via the wireless communication unit 55, that the moving object dn has
intruded (S28A).
[0222]
The system control unit 40 may determine that the moving object dn has entered when both of
the distance l1 and the distance l2 are within the alert distance lm.
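The intrusion decision of S27A and S28A can be expressed compactly. Whether any single estimate or every estimate must fall within the alert distance lm is a policy choice, and both variants mentioned in the text are covered below; this is a hypothetical sketch, not the system control unit's actual implementation:

```python
def intrusion_detected(distances, alert_distance, require_all=False):
    """True when the moving object is judged to be inside the alert area.

    distances: distance estimates such as l1, l2 (and optionally l3).
    require_all: False -> any estimate within the alert distance triggers
                          the alert (the S27A behaviour);
                 True  -> every estimate must be within the alert distance
                          (the stricter variant also described in the text).
    """
    check = all if require_all else any
    return check(d <= alert_distance for d in distances)
```

For example, with l1 = 10 m, l2 = 30 m and lm = 20 m, the any-variant reports an intrusion while the all-variant does not.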
[0223]
After the process of S28A, the system control unit 40 returns to the process of S21.
[0224]
On the other hand, when all of the distances l1 to l3 exceed the alert distance lm, the system
control unit 40 determines whether or not to end the various processes in FIG. 27 (detection of
the presence of the moving object dn and of the distance to the moving object dn, and intrusion
determination for the moving object dn) (S29).
[0225]
If the various processes of FIG. 27 are not completed, the system control unit 40 returns to the
process of S21 and repeats the various processes of FIG. 27.
On the other hand, when the various processes in FIG. 27 are ended in S29, the object detection
system 5E ends the process of FIG. 27.
[0226]
[Effects, etc.] As described above, the object detection system 5E may include the sound source
detection device 30B, which detects the moving object dn using the microphone array MA1, and
the sound source detection device 30C, which detects the moving object dn using the microphone
array MA2.
The control box 10A may derive the distances l1 and l2 from the sound source detection devices
30B and 30C to the moving object dn based on the pointing direction in which the moving object
dn detected by the sound source detection device 30B exists, the pointing direction in which the
moving object dn detected by the sound source detection device 30C exists, and the distance L
between the sound source detection devices 30B and 30C.
The system control unit 40 may determine that the moving object dn is present in the alert area
when the derived distances l1 and l2 are within the alert distance lm.
The sound source detection devices 30B and 30C are examples of an object detection device.
[0227]
Thus, by providing a plurality of microphone arrays MA1 and MA2, the object detection system
5E can measure the distance to the moving object dn even when the distance measuring device is
omitted.
In addition, when the distance to the moving object dn is within the alert distance, the object
detection system 5E can notify the user that the moving object dn is near.
Further, by using a plurality of microphone arrays MA1 and MA2, it is possible to expand the
sound collection area of the sound generated by the moving object dn.
[0228]
In addition, for example, when the distance to the moving object dn is within a predetermined
distance, the monitoring device 90 may output an alert on the determination that the moving
object dn has intruded into the alert area. The alert may be issued in various ways, such as
display, sound, or vibration.
[0229]
(Other Embodiments) As described above, the first to fifth embodiments have been described as
examples of the technology in the present disclosure. However, the technology in the present
disclosure is not limited to this, and can also be applied to embodiments in which changes,
replacements, additions, omissions, and the like have been made. Also, the embodiments may be
combined.
[0230]
In the first to fifth embodiments, the moving object dn is exemplified as the object (target object),
but the moving object dn may be an unmanned flying object or a manned flying object. The
moving object dn is not limited to an object flying in space, but may be an object moving along
the ground. Furthermore, the object may be a stationary object that does not move. In addition,
a stationary object may be detected by moving a carrying device equipped with any of the object
detection systems 5, 5A to 5E of the first to fifth embodiments relative to the stationary object,
thereby changing the relative positional relationship between the stationary object and the object
detection system.
[0231]
In the first to fifth embodiments, the sound emitted by the moving object dn includes sound in
the audible frequency band (20 Hz to 20 kHz) and sound outside the audible frequency band,
namely ultrasonic waves (20 kHz or more) and low-frequency sound (less than 20 Hz).
[0232]
In the first to fifth embodiments, the microphone array MA, the control boxes 10 and 10A, the
monitor 50 and the like are configured as individually independent devices, and the object
detection system includes them.
The microphone array MA, the control boxes 10 and 10A, the monitor 50 and the like may be
realized as an object detection device housed in a single case. Such an object detection apparatus
is convenient as a portable apparatus.
[0233]
In the first to fifth embodiments, the processors 25, 26, 26B, 26C are provided in the sound
source detection device, but may be provided in the control boxes 10, 10A.
[0234]
In the first to fifth embodiments, the sound source detection devices 30, 30A, 30B, and 30C are
illustrated as including the omnidirectional camera CA. However, the sound source detection
device 30 and the omnidirectional camera CA may be configured separately.
Also, the omnidirectional camera CA may be omitted.
[0235]
In the first to fifth embodiments, the microphone array MA in the sound source detection device
30 and the processor 26 for processing an acoustic signal are provided in the same case, but
they may be provided in separate cases. For example, the microphone array MA may be included
in the sound source detection device 30, and the processor 26 may be included in the control box
10 or 10A.
[0236]
In the first to fifth embodiments, the sound source detection device 30 is attached such that its
sound collecting surface and imaging surface face vertically upward, but the sound source
detection device 30 may be attached facing another direction. For example, the sound source
detection device 30 may be attached such that the sound collecting surface and the imaging
surface face a horizontal direction orthogonal to the vertical direction.
[0237]
Although the fifth embodiment exemplifies sending the detection result of the moving object dn
and the intrusion notification of the moving object dn to the monitoring device 90, they may
instead be sent to the monitor 50, as in the first to fourth embodiments.
[0238]
In the fifth embodiment, two sound source detection devices 30 are illustrated, but the number
of sound source detection devices 30 may be determined according to, for example, the alert
level of the area where they are installed.
For example, the higher the alert level, the more sound source detection devices 30 may be
installed, and the lower the alert level, the fewer may be installed.
[0239]
Although the fifth embodiment illustrates that the monitoring device 90 is provided separately
from the object detection system 5E, the monitoring device 90 may be included in the object
detection system 5E.
[0240]
In the first to fifth embodiments, the processor may be physically configured in any way.
In addition, since the processing content can be changed by changing the program by using a
programmable processor, the degree of freedom in processor design can be increased. The
processor may be configured of one semiconductor chip or may be physically configured of a
plurality of semiconductor chips. When it comprises a plurality of semiconductor chips, each
control of the first to fifth embodiments may be realized by different semiconductor chips. In this
case, it can be considered that one processor is configured by the plurality of semiconductor
chips. Further, the processor may be configured by a member (such as a capacitor) having a
function different from that of the semiconductor chip. Further, one semiconductor chip may be
configured to realize the function of the processor and the other functions.
[0241]
The present disclosure is useful for an object detection device, an object detection system, an
object detection method, and the like that can improve the detection accuracy of an object.
[0242]
5, 5A, 5B, 5C, 5D, 5E Object detection system 8 Monitoring area 10, 10A Control box 21, 72
Image sensor 22, 73 Image pickup signal processing unit 23, 74 Camera control unit 25, 26,
26B, 26C, 45, 68, 77 processor 30, 30A, 30B, 30C sound source detection device 31 A / D
converter 32 buffer memory 32A, 46 memory 33 directivity processing unit 34 frequency
analysis unit 35 object detection unit 36 detection result determination unit 37 scan control unit
38 detection direction control unit 39 sound source direction detection unit 40 system control
unit 50 monitor 55 wireless communication unit 60 distance measuring device 61 ultrasonic
sensor 62 ultrasonic speaker 63 reception circuit 64 pulse transmission circuit 65 PT unit 66
distance detection unit 67 distance control Unit 70 PTZ camera 71 Zoom lens 72 Image sensor
73 Imaging signal processing 74 camera control unit 75 PTZ control unit 90 monitoring device
91 display 92 wireless communication device BF1, dr1 pointing range BF2, dr2 pointing
direction Bl building CA omnidirectional camera dn moving object GZ1 omnidirectional image
GZ2 PTZ image GZL magnified image Lg vertical MA, MA1, MA2 Microphone array M1 to M8
Microphone sp1 Source direction image Wd Width