CN101657143B - Unitary vision and neuro-processing testing center - Google Patents

Unitary vision and neuro-processing testing center

Info

Publication number
CN101657143B
CN101657143B CN2008800118947A CN200880011894A
Authority
CN
China
Prior art keywords
test
input
visual indicia
neural
vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2008800118947A
Other languages
Chinese (zh)
Other versions
CN101657143A (en)
Inventor
Alan W. Reichow
Ryan Coulter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nike Innovate CV USA
Original Assignee
Nike International Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nike International Ltd filed Critical Nike International Ltd
Priority claimed from PCT/US2008/060249 external-priority patent/WO2008128190A1/en
Publication of CN101657143A publication Critical patent/CN101657143A/en
Application granted granted Critical
Publication of CN101657143B publication Critical patent/CN101657143B/en
Expired - Fee Related (current legal status)
Anticipated expiration

Abstract

System and methods for testing and/or training a subject's vision and neuro-processing abilities are provided. More specifically, the method may include testing various aspects of the subject's vision and neuro-processing abilities, such as depth perception, anticipation timing, perception speed ability, perception scan ability, etc. By using various tests, an efficient examination may be administered. In accordance with the invention, an individual may be subjected to such a method of testing and/or training at a unitary center capable of presenting such tests to the individual, receiving input from the individual, and processing the received input. Such a unitary test center may further be configurable, so that the tests administered may vary based on the needs of the individual. The received input may then, for example, be used to compute data related to the user's vision and neuro-processing abilities, both overall and for each individual test.

Description

Unitary vision and neuro-processing testing center
Cross-Reference to Related Applications
This application claims priority to U.S. Provisional Patent Application No. 60/923,434, filed April 13, 2007, entitled "System and Method for Testing Visual Abilities in a Simulated Game", which is incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 60/941,915, filed June 4, 2007, entitled "System and Method for Damped Visual Ability Testing", which is incorporated herein by reference.
Statement Regarding Federally Sponsored Research or Development
Not applicable.
Technical field
The present invention relates generally to the evaluation and/or training of an individual's vision and neuro-processing abilities.
Background
When an individual participates in an activity such as a sport, the individual's vision, together with physical ability, plays a role in the individual's level of performance. Typically, to improve at a sport or activity, an individual concentrates on improving his or her physical abilities in order to raise the overall level of performance. However, an individual's performance may also be improved by testing and training the individual's vision and coordination abilities, or visual acuity.
Brief Summary of the Invention
This summary is provided to introduce, in a simplified form, a selection of concepts that are further described in the detailed description below. It is not intended to identify key features or essential features of the claimed invention, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
According to the present invention, a method for testing and/or training a subject's vision and coordination abilities is provided. More specifically, the method may include testing various aspects of the subject's vision and coordination abilities. By using a variety of tests, a more efficient examination may be administered. In accordance with the present invention, an individual may undergo such testing and/or training at a unitary center capable of presenting vision and coordination tests to the individual, receiving input from the individual, and processing the received input. Such a unitary testing center may further be configurable, so that the tests administered may vary according to the needs of the individual. The received input may then be used, for example, to compute data associated with the individual's vision and coordination abilities, both overall and for each individual test.
Brief Description of the Drawings
The present invention is described in detail below with reference to the accompanying drawings, in which:
FIG. 1 is a block diagram of a computing system environment suitable for implementing embodiments of the present invention;
FIG. 2 is a block diagram of an exemplary testing component in accordance with an embodiment of the present invention;
FIG. 3 is a block diagram of an exemplary processing component suitable for implementing embodiments of the present invention;
FIG. 4 depicts an exemplary unitary vision and coordination testing unit in accordance with an embodiment of the present invention;
FIG. 5 depicts another embodiment of a unitary vision and coordination testing unit in accordance with the present invention; and
FIG. 6 is a flow chart depicting a method for testing the vision and coordination abilities of a subject at a unitary site, in accordance with an embodiment of the present invention.
Detailed Description
The subject matter of the present invention is described here with specificity to meet statutory requirements. The description itself, however, is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, in conjunction with other present or future technologies, so as to include steps different from, or combinations of steps similar to, the steps described in this document.
In accordance with the present invention, systems and methods are provided for testing a subject's vision and coordination abilities at a unitary testing unit. The method may include testing various aspects of the subject's vision and coordination abilities (for example, eye-hand coordination, reaction time under divided attention, body coordination, and the like) at a unitary testing unit capable of processing the resulting data on site and/or transferring the data over a network for processing elsewhere. Through such operation, a unitary testing center may make the process of testing a subject's vision and coordination abilities more efficient and may reduce the expense required to administer the tests (for example, by reducing the equipment needed). In addition, the unitary testing center may be configurable, so that the tests administered may vary according to the needs of the individual. The received input may be used, for example, to compute results associated with the user's vision and coordination abilities, both overall and for each individual test.
In one embodiment, a testing apparatus for testing a subject's vision and coordination abilities is provided. Such a testing apparatus may include a presentation component, an input component, and a processing component, where the presentation component may present visual tests to the subject, such as a visual tracking test, a distance focusing test, a visual aiming test, and the like. The subject may provide input to the testing apparatus in response to each test. The input component may then be configured to receive the input, and the processing component may be configured to process the received input.
In another embodiment, a method for testing a subject's vision and coordination abilities is provided, where the method is performed at a unitary site. The method includes, in part, administering two or more visual ability tests to the test subject, receiving input from the test subject in response to each test, and processing the input received from the test subject.
Referring to the drawings generally, and initially to FIG. 1 in particular, a block diagram of an exemplary computing system is shown and designated generally as computing system 100, which is configured to provide for testing a subject's vision and coordination abilities. Those skilled in the art will understand and appreciate that the computing system 100 shown in FIG. 1 is merely an example of one suitable computing system environment and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the present invention. Neither should computing system 100 be interpreted as having any dependency or requirement relating to any single component or combination of components illustrated in FIG. 1.
Computing system 100 includes an input device 102, a display device 120, a database 104, a central location 106, and a testing unit 110, all connected to one another via a connection 108. Connection 108 may be wired (for example, a cable) or wireless (for example, a wireless network). Connection 108 may also be a network, where the network may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in enterprise computer networks, intranets, and the Internet. Further, connection 108 may comprise a locally wired connection between components of computing system 100. Accordingly, connection 108 is not further described herein.
Input device 102 can receive one or more responses from a subject and may be any device capable of receiving a response from a subject. Those skilled in the art will appreciate that more than one input device, such as input device 102, may be used with computing system 100. Input device 102 may be, for example, a microphone, joystick, gamepad, wireless device, keyboard, keypad, game controller, treadmill, force plate, eye tracking system, gesture recognition system, touch-sensitive screen, and/or any other input-initiating component that provides wired or wireless data to testing unit 110, such as data received over network 108. Input device 102 may include voice-recognition equipment and/or software that processes auditory inputs from the test subject. For example, to indicate recognition of a visual indicia, the auditory input from the subject may be a verbalization of a trait possessed by the visual indicia. In one embodiment, if the trait is the orientation of a Landolt 'C', acceptable auditory responses may be 'up', 'down', 'right', and 'left'. Those skilled in the art will understand and appreciate, however, that other auditory inputs (for example, stating a color, number, letter, or symbol) may be used to indicate that the subject perceived and/or recognized the visual indicia. It should be noted that the present invention is not limited to implementation with such an input device 102; it may be implemented with any of a number of different types of devices within the scope of the embodiments. Input device 102 may receive and capture input indicating the subject's response to a displayed visual indicia. If the trait is a directional orientation, a satisfactory test response may be identifying the direction in which the visual indicia faces. By way of example, and not limitation, such identification may comprise the subject orienting a joystick of a hand-held input device 102 in the corresponding direction.
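The patent leaves the response-judging logic to the implementation. The following is a minimal sketch of how a directional input from a joystick or keyboard might be checked against the orientation of a displayed Landolt 'C'; the key names, mapping, and judge_response function are illustrative assumptions, not part of the patent.

```python
import random

ORIENTATIONS = ("up", "down", "left", "right")

# Hypothetical mapping from joystick/keyboard events to a reported orientation.
KEY_TO_ORIENTATION = {
    "KEY_UP": "up",
    "KEY_DOWN": "down",
    "KEY_LEFT": "left",
    "KEY_RIGHT": "right",
}

def judge_response(displayed_orientation: str, key: str) -> bool:
    """True if the subject's directional input matches the gap of the Landolt C."""
    return KEY_TO_ORIENTATION.get(key) == displayed_orientation

displayed = random.choice(ORIENTATIONS)   # orientation of the presented Landolt C
print(displayed, judge_response(displayed, "KEY_LEFT"))
```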
Display device 120 can visually display output that the subject observes, and may be a monitor of any type, such as a computer monitor, testing apparatus, or television, including a cathode-ray tube, liquid-crystal display, plasma screen, or any other display type, or may comprise a screen onto which images are projected from the front or from the rear. Further, display device 120 may provide a user interface for a test administrator to interact with testing unit 110 before, during, and after visual ability tests are administered to the test subject.
If input device 102 is an eye tracking system, the position and/or focus of the subject's eyes may be monitored, and an input registered when the eyes are positioned and/or focused at the proper location.
If input device 102 is a gesture recognition system, a variety of systems and/or methods may be used to receive input. For example, one or more cameras may be used to monitor the movement of the subject's limbs and/or hands and feet and, in conjunction with appropriate hardware and/or software, to register an input when the subject makes an appropriate gesture. The gesture recognition system may also use markers attached to the subject to facilitate motion tracking. Transmitters and receivers attached to the subject may likewise be used as part of the gesture recognition system.
If input device 102 is a touch-sensitive screen, any type of touch-sensitive screen may be used. Additionally, an overlay of touch-sensitive material may be used with a display that is not itself touch sensitive in order to receive touch inputs. Such an overlay may be located at any distance from the display.
Testing unit 110, shown in FIG. 1, may be any type of computing device, embodiments of which are described in greater detail below with reference to FIG. 4 and FIG. 5. Database 104 may be configured to store information associated with vision and coordination abilities. Those skilled in the art will understand and appreciate that the information stored in database 104 is configurable and may include any information relevant to the testing of vision and coordination abilities. The content and volume of such information do not limit the scope of embodiments of the present invention in any way. Further, although illustrated as a single, independent component, database 104 may in fact be a plurality of databases, for instance a database cluster. Portions or all of database 104 may also reside on a computing device associated with testing unit 110, on another external computing device (not shown), and/or on any combination thereof. Those skilled in the art will appreciate that database 104 is optional and need not be implemented in conjunction with computing system 100.
Returning to FIG. 1, in the illustrated embodiment of the present invention, testing unit 110 may include a presentation component 112, an input component 114, a testing component 116, and a processing component 118. Those skilled in the art will appreciate that the components 112, 114, 116, and 118 illustrated in FIG. 1 are exemplary in nature and in number and should not be construed as limiting. Any number of components may be employed to achieve the desired functionality within the scope of embodiments of the present invention.
Presentation component 112 may display video output that the subject can visually observe, and may be a monitor of any type, such as a computer monitor, testing apparatus, or television, including a cathode-ray tube, liquid-crystal display, plasma screen, or any other display type, or may comprise a screen onto which images are projected from the front or from the rear.
In one embodiment, presentation component 112 may be a device employing mirrors and/or lenses configured to produce the visual perception of distance within a limited physical space (for example, a circumferential arrangement of mirrors that produces a tunnel effect). One example of such a device is a perspective testing apparatus that uses mirrors to create the perception of distance. Such a device may include a mirror that displays a visual indicia in the central visual field (that is, directly in front of the subject), and may further include side mirrors that display visual indicia to test peripheral visual abilities.
In another embodiment, such a device may include lenses that alter the perceived distance and/or size of a displayed visual indicia so as to achieve a simulated distance. As a result, such a device may present the test subject with a displayed visual indicia that appears nearer or farther than it actually is. Such a configuration thereby creates a perception of optical infinity for the test subject.
Those skilled in the art will appreciate that presentation component 112 may comprise multiple devices that, in combination, display some of the visual stimuli of a particular activity. In one embodiment, multiple displays of a single device (for example, a split screen) may be used to display visual indicia.
Presentation component 112 may alternatively comprise display eyewear, goggles worn by the subject, screens, or the like that provide the subject with a visual display that is not readily visible to others. In addition, presentation component 112 may present two-dimensional or three-dimensional images to the test subject. Three-dimensional images may include, for example, virtual reality or holographic presentations.
In operation, presentation component 112 may further be configured to present one or more visual indicia to the test subject. As described in detail below, presentation component 112 may present visual indicia in a variety of ways in order to test different aspects of the subject's vision and coordination abilities. In general, each visual indicia may possess one or more traits. Such traits may be, for example, a directional orientation (for example, an arrow, a Landolt 'C', a tumbling 'E', or the like), a position on a user interface (for example, located in a particular quadrant of the display), one of a predetermined number of mutually exclusive traits (for example, pointing up, down, left, or right), or any combination of these traits. Moreover, those skilled in the art will understand and appreciate that other traits may be used, and the present invention is not limited to any particular trait.
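As an illustration of the trait model described above, the sketch below shows one plausible data structure for a visual indicia and its traits; the class, field names, and depth field are assumptions made for the example and are not the patent's own data model.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Orientation(Enum):     # one of a predetermined set of mutually exclusive traits
    UP = "up"
    DOWN = "down"
    LEFT = "left"
    RIGHT = "right"

@dataclass
class VisualIndicia:
    kind: str                                  # e.g. "landolt_c", "tumbling_e", "arrow"
    orientation: Optional[Orientation] = None  # directional orientation trait, if any
    quadrant: Optional[int] = None             # 1-4: position on the user interface
    depth_m: Optional[float] = None            # apparent viewing distance, if simulated
    extra_traits: dict = field(default_factory=dict)  # e.g. color, size

indicia = VisualIndicia(kind="landolt_c", orientation=Orientation.LEFT, quadrant=2)
print(indicia)
```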
Input component 114 may be configured to receive input from the test subject (for example, through the use of input device 102). Any suitable receiving component capable of receiving the input provided by the subject may be employed with the present invention. By way of example, and not limitation, the subject may provide input using a keyboard, joystick, trackball, or the like. The input may depend on the presentation component. For example, if the presentation component is touch sensitive, the subject may provide input by touching the presentation component. In another embodiment, the input component may have voice-recognition capability, in which case the subject may provide input through a vocalized response recognized by the input component. Those skilled in the art will understand and appreciate that any suitable input component may be employed with the present invention. Depending on the tests presented by the presentation component and on the presentation capabilities described above, certain types of input may be preferable. After receiving input from the subject, input component 114 may, for example, store the input in database 104 for future reference.
Testing component 116 is configured to provide tests to the subject. As described in greater detail below with reference to FIG. 2, testing component 116 may provide two or more tests to determine the subject's vision and coordination abilities. More specifically, multiple tests may be provided at a unitary site, such as testing unit 110. In addition, testing component 116 may be configured so that the tests administered vary from subject to subject. For example, the tests may vary depending on the particular sport or activity, competitive level, and visual strengths or weaknesses of the test subject. Accordingly, testing component 116 may also be responsible for determining the tests, and the level or difficulty of the tests, presented through presentation component 112.
The present invention provides a processing component 118 to process the input received from input component 114. As shown in FIG. 3, processing component 118 may include a scoring component 310, a data collection component 312, a training development component 314, and a transfer component 316. Scoring component 310 may be configured to employ a scoring method to derive a score from the subject's responses to the presented tests. By comparing these responses with the responses of a particular population, typically retrieved from database 104, the subject's responses may be evaluated. Upon receiving and measuring one or more responses to the visual indicia, scoring component 310 may thereby provide an assessment of the subject's vision and coordination abilities. Once a score (for example, a percentile) has been determined, it may be presented to the subject through presentation component 112. The score may be presented at the completion of each test, at the completion of all of the tests, or a combination thereof.
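The scoring method is described only in general terms; a minimal sketch of percentile scoring against stored population norms (a plain list standing in here for data retrieved from database 104) might look like the following, with all names and the example norm data assumed.

```python
from bisect import bisect_right
from typing import Sequence

def percentile_score(subject_result: float, population_results: Sequence[float]) -> float:
    """Percentage of the reference population the subject performed at or above.

    Assumes higher raw results are better; invert the comparison for measures
    such as reaction time where lower is better.
    """
    ranked = sorted(population_results)
    at_or_below = bisect_right(ranked, subject_result)
    return 100.0 * at_or_below / len(ranked)

population = [12, 15, 18, 21, 22, 25, 27, 30, 31, 35]   # illustrative norm data
print(f"{percentile_score(26, population):.0f}th percentile")
```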
Data collection component 312 is configured to collect the data received from input component 114. Such data may be stored, for example, in database 104. The collected data may further be used to create norms for particular populations, which may then be employed by scoring component 310. Those skilled in the art will appreciate that database 104 and/or scoring component 310 may be located remotely from system 100.
Training development component 314 is configured to develop a training plan or regimen for the test subject based on the collected data and the determined scores. In embodiments of the present invention, testing unit 110 may be used to train the test subject after the subject has been tested.
Transfer component 316 is configured to transfer the determined scores, the collected data, and the like to presentation component 112. Transfer component 316 may additionally provide the data to an external computing device, such as central location 106, for further consideration, analysis, or storage. In one embodiment, transfer component 316 may provide data to testing component 116 in real time, so that tests may be configured or changed during the testing process. Those skilled in the art will understand and appreciate that, notwithstanding the embodiments and examples described above, transfer component 316 may provide information associated with vision and coordination testing to any component of computing system 100, whether internal or external to testing unit 110.
Those skilled in the art will appreciate that transfer component 316 may send information from testing unit 110 at any desired frequency. That is, information may be sent where it is needed, for example, after the subject has completed all of the tests or, alternatively, after each individual test is completed. If information is sent to central location 106 or database 104 for storage and/or processing, the information for all subjects may ultimately be sent in aggregate. The frequency of transmission may depend on the storage and processing capacities of testing unit 110 and on the intended use of the information.
Referring now to FIG. 2, testing component 116 is illustrated in greater detail. Testing component 116 may include a depth perception component 210, an anticipation timing component 212, a scan perception component 214, and a speed perception component 216. Testing unit 110 may employ each of these components to test multiple aspects of an individual's vision and coordination abilities. Those skilled in the art will appreciate that other tests may also be used and remain within the scope of the present invention.
Depth perception component 210 is configured to test the subject's depth perception, which may include displaying visual indicia at different depths and requiring the test subject to locate the visual indicia that is, or appears to be, at a certain depth. In one embodiment, a plurality of visual indicia may be presented such that all but one of the visual indicia appear to be at the same depth. In such an embodiment, the test subject may locate the visual indicia that appears to be at a different depth from the others and enter that response into testing unit 110. Those skilled in the art will recognize and understand that depth perception component 210 may employ any suitable test capable of testing the subject's depth perception.
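A minimal sketch of the odd-one-out depth perception trial described above follows; the trial structure, depth values, and function names are assumed for illustration only.

```python
import random

def make_depth_trial(n_indicia: int = 4, base_depth: float = 3.0,
                     offset: float = 0.5) -> tuple[list[float], int]:
    """Return the apparent depth of each indicia and the index of the odd one."""
    depths = [base_depth] * n_indicia
    odd_index = random.randrange(n_indicia)
    depths[odd_index] = base_depth + offset     # the one that appears nearer or farther
    return depths, odd_index

def judge_depth_response(odd_index: int, selected_index: int) -> bool:
    """True if the subject selected the indicia at the different apparent depth."""
    return selected_index == odd_index

depths, odd = make_depth_trial()
print(depths, judge_depth_response(odd, selected_index=2))
```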
Anticipation timing component 212 is configured to test the test subject's ability to anticipate when a moving visual indicia will be at a particular location. In one embodiment, a visual indicia, such as a dot or circle, is presented to the subject so that the indicia appears to move toward the subject. The subject may then provide an input to indicate when the subject anticipates that the visual indicia has reached a particular location. Those skilled in the art will recognize and understand that anticipation timing component 212 may employ any suitable anticipation timing test.
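The anticipation timing test lends itself to a simple timing-error measure. The sketch below assumes a constant approach speed and illustrative distances; it is not taken from the patent.

```python
def arrival_time(start_distance_m: float, speed_m_per_s: float) -> float:
    """Time at which the approaching indicia reaches the target location."""
    return start_distance_m / speed_m_per_s

def timing_error(response_time_s: float, start_distance_m: float,
                 speed_m_per_s: float) -> float:
    """Signed error: negative means the subject responded early, positive means late."""
    return response_time_s - arrival_time(start_distance_m, speed_m_per_s)

# Indicia starts 6 m away moving at 4 m/s; subject responds at 1.40 s.
print(f"error = {timing_error(1.40, 6.0, 4.0) * 1000:+.0f} ms")
```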
Scan perception component 214 is configured to test the test subject's visual scanning ability. Any suitable test may be used and remains within the scope of the present invention. By way of example, and not limitation, visual indicia may be presented to the test subject. The visual indicia may comprise a grid of individual visual indicia in a particular pattern. For example, a grid of dots may be displayed in which some of the dots are solid and the others are not. The previously solid dots may then be displayed with an appearance similar to the other dots in the grid, and the subject must identify which dots were solid before. Another exemplary scan perception test includes presenting a random set of digits to the subject for a particular period of time and having the subject enter the digits perceived.
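For the grid-based scan perception test, one plausible scoring rule is the fraction of originally solid dots the subject identifies; the sketch below uses that assumed rule, and the grid size and cell count are illustrative.

```python
import random

def make_scan_trial(grid_size: int = 5, n_solid: int = 6) -> set[tuple[int, int]]:
    """Randomly choose which grid cells are briefly shown as solid."""
    cells = [(r, c) for r in range(grid_size) for c in range(grid_size)]
    return set(random.sample(cells, n_solid))

def score_scan_response(solid: set[tuple[int, int]],
                        selected: set[tuple[int, int]]) -> float:
    """Fraction of the originally solid cells the subject correctly identified."""
    return len(solid & selected) / len(solid)

solid_cells = make_scan_trial()
response = set(list(solid_cells)[:4])        # subject recalled 4 of the 6 cells
print(f"recall = {score_scan_response(solid_cells, response):.0%}")
```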
Speed perception component 216 is configured to test the speed at which the test subject can perceive a visual indicia. In one embodiment, a visual indicia is displayed or flashed to the test subject for a certain period of time. Another visual indicia is then displayed at a different location on the display device for a varying period of time. Between each flashed visual indicia, a neutral visual indicia may be presented at the center of the display. Having the test subject identify each flashed visual indicia measures the subject's vision and neuro-processing abilities, and thereby the speed at which the subject can perceive. Those skilled in the art will appreciate that any suitable test capable of testing perception speed may be used.
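One common way to vary the flash duration and estimate how briefly an indicia can be perceived is an up-down staircase; the patent does not prescribe this procedure, so the sketch below, including the step sizes and starting duration, is only an assumed illustration.

```python
def next_flash_duration(current_ms: float, correct: bool,
                        step_ms: float = 20.0, floor_ms: float = 20.0) -> float:
    """Shorten the flash after a correct identification, lengthen it after an error."""
    if correct:
        return max(floor_ms, current_ms - step_ms)
    return current_ms + step_ms

duration = 200.0
for correct in [True, True, False, True, True, True, False]:   # simulated responses
    duration = next_flash_duration(duration, correct)
print(f"estimated perception-speed threshold ~ {duration:.0f} ms")
```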
Referring now to FIG. 4, an exemplary vision and neuro-processing testing system 400 in accordance with the present invention is shown. By providing a unitary testing unit, such as testing unit 412, capable of presenting multiple tests, a better overall assessment of a subject's vision and neuro-processing abilities may be provided. Further, because testing unit 412 may include processing capability, it can process the data, resulting in determined scores and/or a training regimen for the subject. Display device 414 may output visual stimuli to subject 410. Subject 410 may provide input in response to the visual stimuli using input device 416. A spatial display 418 may alternatively and/or additionally output visual stimuli. For example, as described above, spatial display 418 may be used in conjunction with speed perception component 216, anticipation timing component 212, or a combination of tests. Of course, spatial display 418 may also be used in conjunction with other types of visual tests.
FIG. 5 further illustrates vision and neuro-processing testing system 400 in accordance with an embodiment of the present invention. More specifically, in this example, display device 414 outputs as visual stimuli a top indicia 421, a bottom indicia 422, a left indicia 423, and a right indicia 424. The indicia presented on display device 414 may be static or dynamic. The subject may use input device 416 to enter a selection of one or more of the indicia. The selection of an indicia may depend, for example, on criteria such as movement, alignment, depth, color, size, difference, or other visual characteristics.
Referring now to FIG. 6, a flow chart 600 of a method for testing a subject's vision and neuro-processing abilities is shown. Although the terms 'step' and 'block' are used below to denote different elements of the method employed, the terms should not be interpreted as implying any particular order among or between the steps disclosed herein unless the order of individual steps is explicitly stated. Initially, two or more vision and/or neuro-processing tests are administered to a test subject (for example, by testing unit 110 of FIG. 1). This is indicated at block 610. Those skilled in the art will appreciate that any of the tests described above, as well as other tests that measure an individual's vision and neuro-processing abilities, may be administered. The particular tests administered, and the order of the tests, may be configured based on the subject's ability level, competitive level, particular activity, and the like. As the tests are administered, the subject may provide appropriate responses through the input component by interacting with an input device connected to the testing unit. This is indicated at block 620. Multiple input devices may be used, and more than one response may be received from the subject. For example, in a divided-attention test, the subject may provide one response to a visual indicia in an eye-hand coordination test and, as described above, another response to a visual indicia at a different location. A processing component (for example, processing component 118 of FIG. 1) then processes the received input, for example by collecting data, determining a score, or developing a training regimen. The data may, for example, be stored in database 104 or sent to central location 106 via the transfer component. This is indicated at block 630.
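Blocks 610 through 630 amount to an administer, respond, and process loop. The sketch below shows that flow at a high level; the battery contents, callable interfaces, and result format are all assumed for illustration and do not come from the patent.

```python
from typing import Callable, Dict, List

# A "test" here is simply a callable that runs its trials and returns raw results;
# real presentation and input components would stand behind these callables.
TestBattery = Dict[str, Callable[[], List[float]]]

def run_session(battery: TestBattery,
                process: Callable[[Dict[str, List[float]]], None]) -> None:
    results: Dict[str, List[float]] = {}
    for name, administer_test in battery.items():   # blocks 610 and 620
        results[name] = administer_test()           # responses gathered from the subject
    process(results)                                 # block 630: score, store, transfer

battery: TestBattery = {
    "depth_perception": lambda: [1.0, 1.0, 0.0, 1.0],     # illustrative trial results
    "anticipation_timing": lambda: [-0.04, 0.02, 0.01],   # illustrative timing errors (s)
}
run_session(battery, process=lambda r: print("collected:", r))
```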
Optionally, at block 640, the data received from the subject's inputs in each test may be used to determine scores for the subject. A score may be determined for each test, and an overall score may be determined from the data for all of the tests. The scores may further depend on data for a particular population against which the subject's data can be compared (for example, the subject may be given a percentile indicating his or her level). At block 650, a training regimen for the test subject is developed based on, for example, the determined scores and the subject's inputs in response to the visual ability tests, in order to train his or her vision and coordination abilities.
The present invention has been described in relation to particular embodiments, which are intended in all respects to be illustrative rather than restrictive. Alternative embodiments will become apparent to those skilled in the art without departing from the scope of the present invention.
From the foregoing, it will be seen that this invention is well adapted to attain all of the ends and objects set forth above, together with other advantages that are obvious and inherent to the system and method. It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations. This, too, is within the scope of the present invention.

Claims (12)

1. A device for testing a subject's vision and neuro-processing abilities, comprising:
a presentation component configured to present a plurality of vision and neuro-processing tests, wherein the vision and neuro-processing tests comprise tests that assess the subject's vision and neuro-processing, and wherein the subject provides a response input to each test;
an input component configured to receive the input provided by the subject;
a testing component configured to provide the tests to the subject, wherein the testing component comprises:
a depth perception component configured to test the subject's depth perception;
an anticipation timing component configured to test the subject's ability to anticipate when a moving visual indicia will reach a particular location;
a scan perception component configured to test the subject's visual scanning ability;
a component configured to test the speed at which the subject can perceive a visual indicia; and
a processing component configured to process the received input, wherein the processing component comprises:
a data collection component configured to collect the data received from the input component;
a scoring component configured to determine a score based on the received input;
a transfer component configured to transfer the collected data and the determined score to the presentation component; and
a training development component configured to develop a training plan or regimen for the test subject based on the collected data and the determined score.
2. The device according to claim 1, characterized in that the plurality of vision and neuro-processing tests comprises a depth perception test.
3. The device according to claim 2, characterized in that the depth perception test comprises a first visual indicia presented to the subject, the first visual indicia being located by input provided by the subject in response to the position of the first visual indicia.
4. The device according to claim 1, characterized in that the plurality of vision and neuro-processing tests comprises an anticipation timing test.
5. The device according to claim 4, characterized in that the anticipation timing test comprises a second visual indicia, and the second visual indicia is a Landolt 'C'.
6. The device according to claim 1, characterized in that the plurality of vision and neuro-processing tests comprises a perception speed test.
7. The device according to claim 1, characterized in that the plurality of vision and neuro-processing tests comprises a perception scan test.
8. A method for testing a subject's vision and neuro-processing abilities, wherein the method is performed at a unitary site, the method comprising:
administering a plurality of vision and neuro-processing tests to a test subject, wherein the vision and neuro-processing tests assess the subject's vision and neuro-processing abilities;
receiving input from the test subject in response to each test;
providing the tests to the subject, wherein the tests comprise:
testing the subject's depth perception;
testing the subject's ability to anticipate when a moving visual indicia will reach a particular location;
testing the subject's visual scanning ability;
testing the speed at which the subject can perceive a visual indicia; and
processing the received input, comprising:
collecting the input data;
determining a score based on the received input;
transferring and presenting the collected data and the determined score; and
developing a training plan or regimen for the test subject based on the collected data and the determined score.
9. The method according to claim 8, characterized in that the plurality of vision and neuro-processing tests comprises a depth perception test.
10. The method according to claim 9, characterized in that the depth perception test comprises a first visual indicia presented to the subject, the first visual indicia being located by input provided by the subject in response to the position of the first visual indicia.
11. The method according to claim 8, characterized in that the plurality of vision and neuro-processing tests comprises an anticipation timing test.
12. The method according to claim 11, characterized in that the anticipation timing test comprises a second visual indicia, and the second visual indicia is a Landolt 'C'.
CN2008800118947A 2007-04-13 2008-04-14 Unitary vision and neuro-processing testing center Expired - Fee Related CN101657143B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US92343407P 2007-04-13 2007-04-13
US60/923,434 2007-04-13
US94191507P 2007-06-04 2007-06-04
US60/941,915 2007-06-04
PCT/US2008/060249 WO2008128190A1 (en) 2007-04-13 2008-04-14 Unitary vision and neuro-processing testing center

Publications (2)

Publication Number Publication Date
CN101657143A CN101657143A (en) 2010-02-24
CN101657143B true CN101657143B (en) 2012-05-30

Family

ID=41711088

Family Applications (5)

Application Number Title Priority Date Filing Date
CN200880011994XA Expired - Fee Related CN101657146B (en) 2007-04-13 2008-04-14 Syetems and methods for testing and/or training near and farvisual abilities
CN2008800118947A Expired - Fee Related CN101657143B (en) 2007-04-13 2008-04-14 Unitary vision and neuro-processing testing center
CN200880011916XA Expired - Fee Related CN101657144B (en) 2007-04-13 2008-04-14 Unitary vision and neuro-processing testing center
CN200880011961.5A Expired - Fee Related CN101657846B (en) 2007-04-13 2008-04-14 The method and system of visual cognition and coordination testing and training
CN2008800119314A Expired - Fee Related CN101657145B (en) 2007-04-13 2008-04-14 Unitary vision testing center

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN200880011994XA Expired - Fee Related CN101657146B (en) 2007-04-13 2008-04-14 Syetems and methods for testing and/or training near and farvisual abilities

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN200880011916XA Expired - Fee Related CN101657144B (en) 2007-04-13 2008-04-14 Unitary vision and neuro-processing testing center
CN200880011961.5A Expired - Fee Related CN101657846B (en) 2007-04-13 2008-04-14 The method and system of visual cognition and coordination testing and training
CN2008800119314A Expired - Fee Related CN101657145B (en) 2007-04-13 2008-04-14 Unitary vision testing center

Country Status (1)

Country Link
CN (5) CN101657146B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6731850B2 (en) * 2013-09-02 2020-07-29 オキュスペクト オサケ ユキチュア Threshold inspection and determination
WO2015037089A1 (en) * 2013-09-11 2015-03-19 日立コンシューマエレクトロニクス株式会社 Brain dysfunction assessment method, brain dysfunction assessment device, and program thereof
SG11201602944YA (en) * 2013-10-17 2016-05-30 Childrens Healthcare Atlanta Inc Methods for assessing infant and child development via eye tracking
FR3014673A1 (en) * 2013-12-17 2015-06-19 Essilor Int APPARATUS AND METHOD FOR DETECTING VIEW FAULTS AND VISUAL ACUTE MEASUREMENT OF A USER
CN104970763A (en) * 2014-04-09 2015-10-14 冯保平 Full-automatic vision detecting training instrument
CN104382560A (en) * 2014-12-08 2015-03-04 丹阳市司徒镇合玉健身器械厂 Ataxia detector
CN104586403A (en) * 2015-01-21 2015-05-06 陕西省人民医院 Finger movement mode monitoring and analysis device and use method thereof
CN104851326A (en) * 2015-05-18 2015-08-19 吉首大学 Ideological and political work demonstration and teaching instrument
CN104887467A (en) * 2015-06-03 2015-09-09 侯跃双 Child vision correction recovery instrument
CN105496347B (en) * 2016-01-12 2017-06-06 哈尔滨学院 Depending on depth electronic measuring device
ES2702484T3 (en) 2016-01-15 2019-03-01 Centre Nat Rech Scient Device and procedure for determining eye movements by touch interface
EP3435840B1 (en) * 2016-03-31 2019-12-04 Koninklijke Philips N.V. Device, system and computer program product for detecting muscle seizure of a subject
CN106726388B (en) * 2017-01-04 2019-02-05 深圳市眼科医院 A kind of training device and its control method of extraocular muscle neural feedback muscle
CN107736889B (en) * 2017-09-08 2021-01-08 燕山大学 Detection method of human body coordination detection device
CN109727508B (en) * 2018-12-11 2021-11-23 中山大学中山眼科中心 Visual training method for improving visual ability based on dynamic brain fitness
CN109744994A (en) * 2019-03-12 2019-05-14 西安爱特眼动信息科技有限公司 A kind of perimetry device based on multihead display
CN109998491A (en) * 2019-04-25 2019-07-12 淮南师范学院 A kind of glasses and method of test depth perceptibility
WO2021086274A1 (en) * 2019-10-30 2021-05-06 Chulalongkorn University A stimulating system for collaborative functions of brain and body
CN113018124A (en) * 2021-03-02 2021-06-25 常州市第一人民医院 Rehabilitation device for unilateral neglect of patient
CN115969677B (en) * 2022-12-26 2023-12-08 广州视景医疗软件有限公司 Eyeball movement training device
CN116115981A (en) * 2023-02-09 2023-05-16 湖南理工学院 Table tennis player service action recognition training instrument
CN116172560B (en) * 2023-04-20 2023-08-29 浙江强脑科技有限公司 Reaction speed evaluation method for reaction force training, terminal equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825460A (en) * 1994-04-30 1998-10-20 Canon Kabushiki Kaisha Visual function measuring apparatus
US6364845B1 (en) * 1998-09-17 2002-04-02 University Of Rochester Methods for diagnosing visuospatial disorientation or assessing visuospatial orientation capacity
US6632174B1 (en) * 2000-07-06 2003-10-14 Cognifit Ltd (Naiot) Method and apparatus for testing and training cognitive ability

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4528989A (en) * 1982-10-29 1985-07-16 Weinblatt Lee S Screening method for monitoring physiological variables
US4618231A (en) * 1984-02-22 1986-10-21 The United States Of America As Represented By The Secretary Of The Air Force Accommodative amplitude and speed measuring instrument
US5088810A (en) * 1989-01-23 1992-02-18 Galanter Stephen M Vision training method and apparatus
CN1077873A (en) * 1992-04-22 1993-11-03 四川大学 Computerized comprehensive test system for visual sense
US5812239A (en) * 1996-10-22 1998-09-22 Eger; Jeffrey J. Method of and arrangement for the enhancement of vision and/or hand-eye coordination
US6092058A (en) * 1998-01-08 2000-07-18 The United States Of America As Represented By The Secretary Of The Army Automatic aiding of human cognitive functions with computerized displays
US6066105A (en) * 1998-04-15 2000-05-23 Guillen; Diego Reflex tester and method for measurement
US6454412B1 (en) * 2000-05-31 2002-09-24 Prio Corporation Display screen and vision tester apparatus


Also Published As

Publication number Publication date
CN101657145A (en) 2010-02-24
CN101657143A (en) 2010-02-24
CN101657146B (en) 2012-01-18
CN101657144A (en) 2010-02-24
CN101657146A (en) 2010-02-24
CN101657846B (en) 2016-03-09
CN101657145B (en) 2012-01-25
CN101657846A (en) 2010-02-24
CN101657144B (en) 2012-05-30

Similar Documents

Publication Publication Date Title
CN101657143B (en) Unitary vision and neuro-processing testing center
KR101520113B1 (en) Unitary vision and neuro-processing testing center
Perez-Marcos Virtual reality experiences, embodiment, videogames and their dimensions in neurorehabilitation
CN102573610B (en) Unified vision testing and/or training
EP2134245B1 (en) Unitary vision and coordination testing center
KR101726894B1 (en) Testing/training visual perception speed and/or span
CA2770113C (en) Multi-touch display and input for vision testing and training
Durgin et al. Palm boards are not action measures: An alternative to the two-systems theory of geographical slant perception
US20120108909A1 (en) Assessment and Rehabilitation of Cognitive and Motor Functions Using Virtual Reality
Iskander et al. Using biomechanics to investigate the effect of VR on eye vergence system
JP5654341B2 (en) Apparatus and method for examining visual and neural processing of a subject
US20170113095A1 (en) Device and method for restoring and developing hand functions
Straker et al. Children have less variable postures and muscle activities when using new electronic information technology compared with old paper-based information technology
Cidota et al. Assessing upper extremity motor dysfunction using an augmented reality game
Meusel Exploring mental effort and nausea via electrodermal activity within scenario-based tasks
Lai et al. Fun and accurate static balance training to enhance fall prevention ability of aged adults: A preliminary study
KR102550724B1 (en) Augmented reality based cognitive rehabilitation training system and method
Kim et al. Design and application of 2D illusory vibrotactile feedback for hand-held tablets
Ishihara et al. A pilot study on impact of viewing distance to task performance
Adams The integration of vision and touch for locating objects
Hughes et al. One step closer to achieving inclusive design: Design considerations for clients with low vision
Parit et al. Eye tracking based human computer interaction
Petrović et al. Visual impairment simulation for inclusive interface design
Khambadkar Leveraging Proprioception to create Assistive Technology for Users who are Blind
Confalonieri Systems development for diagnostics and dexterity rehabilitation by means of touchscreen technology

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: NIKE INNOVATION LIMITED PARTNERSHIP

Free format text: FORMER OWNER: NIKE INTERNATIONAL LTD.

Effective date: 20141117

C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20141117

Address after: Oregon

Patentee after: NIKE INNOVATE C.V.

Address before: Oregon

Patentee before: Nike International Ltd.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120530

CF01 Termination of patent right due to non-payment of annual fee