WO2014150927A1 - Virtual property reporting for automatic structure detection - Google Patents

Virtual property reporting for automatic structure detection

Info

Publication number
WO2014150927A1
Authority
WO
WIPO (PCT)
Prior art keywords
target structure
processor
street
information
executable code
Application number
PCT/US2014/024567
Other languages
French (fr)
Inventor
Stephen L. Schultz
David Arthur Kennedy
James Smyth
Original Assignee
Pictometry International Corp.
Application filed by Pictometry International Corp. filed Critical Pictometry International Corp.
Priority to EP14771085.9A priority Critical patent/EP2972953A4/en
Priority to CA2906448A priority patent/CA2906448C/en
Priority to MX2015012469A priority patent/MX355657B/en
Priority to AU2014235464A priority patent/AU2014235464B2/en
Publication of WO2014150927A1 publication Critical patent/WO2014150927A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53 Querying
    • G06F16/532 Query formulation, e.g. graphical querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/16 Real estate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/176 Urban or other man-made structures

Definitions

  • the processor executable code when executed by the at least one processor may further cause the at least one processor to analyze a parcel database to determine an identity of neighboring parcels either adjacent to the target structure or within a predefined radius of the target structure, and to make available to a user information related to neighboring parcels when accessing information indicative of the target structure.
  • the processor executable code when executed by the at least one processor may further cause the at least one processor to identify ownership or residency of neighboring parcels and to store a link within the database to information indicative of the owners or residents of neighboring parcels.
  • the processor executable code when executed by the at least one processor may also cause the at least one processor to identify social media information of owners or residents of neighboring parcels and to store information within the database indicative of the social media information of the owners or residents of the neighboring parcels.
  • the geo-referenced imagery may be stored in a database 40 as one or more electronic files that may be rendered into a picture, an image, or a sketch.
  • the electronic files may be in any suitable image format (e.g., JPEG, BMP, TIFF, and/or the like).
  • FIG. 4 illustrates a flow chart 70 of an exemplary method for generally detecting one or more physical properties of a structure, such as the structure 102.
  • the first host system 12a and/or the second host system 12b may additionally manipulate the footprint of the structure 102. Such additional manipulation may be pre-assigned and/or user initiated. Automated or semi-automated algorithms may be used to make any of the above determinations and/or measurements, and a user or an administrator may review the determinations and/or measurements of the algorithms and accept or edit the measurement or determination, or direct the algorithm to start over, for example.
  • the foundation estimate report 200 may include data sets such as customer information 202, foundation information 204, estimated area detail 206, and contractor information 208.
  • the program logic 42 may include instructions to determine and report the orientation of a facet of the structure 102 relative to a street, by determining which facet of the structure 102 faces the street and/or is the front of the structure 102. For example, determining which facet faces the street may be carried out by determining a center of the structure 102, projecting a line connecting the center of the structure to a known location of the street address of the structure, and designating a facet positioned between the center and the street with which the line intersects as the street-facing facet. As described above, the program logic may determine the X, Y, and Z location of the corner of the structure 102.

Abstract

A computer system comprises a processor capable of executing processor executable code operably coupled with a non-transitory computer medium storing processor executable code, which when executed by the processor causes the processor to: (a) receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; (b) in response to receiving the first signal, access a database including information about the target structure; and (c) transmit a second signal over the computer network indicative of a virtual property report for the target structure including at least one image of the target structure. The virtual property report may include information about a facet of the target structure facing a street, location of a main entrance of the target structure, location of a secondary entrance of the target structure, and/or location of vehicle access to the target structure.

Description

VIRTUAL PROPERTY REPORTING FOR AUTOMATIC STRUCTURE DETECTION
BACKGROUND
[0001] In the remote sensing/aerial imaging industry, imagery may be used to capture views of a geographic area in order to measure objects and/or structures within the images. These are generally referred to as "geo-referenced images" and come in two basic categories:
[0002] Vertical Imagery - images captured with a camera pointed vertically downward thus generally capturing the tops of structures; and
[0003] Oblique Imagery - images captured with a camera aimed at an angle, capturing the sides, as well as the tops, of structures.
[0004] Most vertical imagery may be processed in order to fit a mathematically rectangular projection or map. This process is known as ortho-rectification and attempts to create an appearance as if the sensor were directly above each pixel in the image. The resulting image is known as an ortho-photo. Since the images are mathematically projected, they may be combined into a seamless mosaic resulting in a composite image known as an ortho-mosaic. The term 'ortho image' is used to denote a geo-referenced image that is either an ortho-photo image or an ortho-mosaic image.
[0005] Because they are captured looking straight down, an ortho-photo or ortho-mosaic contains a view of the world to which many are not accustomed. As a result, there may be difficulty in distinguishing between two different properties (e.g., buildings, structures, and/or other man-made or natural objects or features) as the only portions of the structures visible in the ortho-mosaic are rooftops. An oblique image, in contrast, is captured at an angle showing sides of objects and structures. Aerial imagery may be used in identification of dimensions of buildings or structures. Traditional ortho-rectified imagery has limited use, however, because it reveals only the edge of the roof and does not reveal several important aspects of the building.
SUMMARY
[0006] In one aspect, the inventive concepts disclosed herein are directed to a computer system comprising at least one processor capable of executing processor executable code operably coupled with a non-transitory computer medium storing processor executable code, which when executed by the at least one processor causes the at least one processor to: (a) receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; (b) in response to receiving the first signal, access a database including information about the target structure; and (c) transmit a second signal over the computer network, the second signal indicative of a virtual property report for the target structure including at least one image of the target structure.
[0007] The processor executable code when executed by the at least one processor may further cause the processor to determine, using aerial imagery and at least one data set indicative of street files, information about one or more of: a facet of the target structure facing a street, location of a main entrance of the target structure relative to the street, location of a secondary entrance of the target structure relative to the street, location of vehicle access to the target structure within the at least one image, and to store the information in the database in association with the target structure. The processor executable code for determining the information may be organized to be executed by the at least one processor prior to executing the instructions to receive the first signal. The processor executable code when executed by the at least one processor may further cause the processor to access the information indicative of the facet of the target structure facing the street, and retrieve and display an image of the facet within the virtual property report. The processor executable code when executed by the at least one processor may also cause the at least one processor to receive a selection of one or more pixels within a displayed image of the target structure in which the one or more pixels have pixel coordinates, transform the pixel coordinates into real-world geographic coordinates, measure distances between the real-world coordinates, and store the measurements on a non-transitory computer readable medium within the database and associated with the target structure, prior to the step of receiving the first signal over the computer network. The processor executable code when executed by the at least one processor may further cause the at least one processor to associate a label identifying the measurement with a particular measurement, and store the label with the measurement within the database and associated with the target structure. The label identifying the measurement may be a particular field within the database, and/or may be selected from the group including orientation and area of a driveway depicted within the displayed image, size or area or elevations of a deck depicted within the displayed image, location and height of trees adjacent to the target structure, areas of windows of the target structure, area of a vertical or pitched surface on the target structure, a height of an eave of the target structure, a height of a chimney of the target structure, a distance to a church from the target structure.
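As a non-limiting illustration of the pixel-to-measurement step described above, the following Python sketch assumes a georeferencing callback (here called pixel_to_latlon, a hypothetical helper supplied by the imagery solution) that maps pixel coordinates to geographic coordinates; it then measures the great-circle distance between two selected pixels so a labeled result could be stored with the target structure. The function and label names are illustrative, not taken from the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def measure_between_pixels(px_a, px_b, pixel_to_latlon, label):
    """Transform two selected pixel coordinates into geographic coordinates,
    measure the distance between them, and return a labeled record that could
    be stored in the database in association with the target structure."""
    lat1, lon1 = pixel_to_latlon(*px_a)  # pixel_to_latlon is an assumed georeferencing callback
    lat2, lon2 = pixel_to_latlon(*px_b)
    return {"label": label, "meters": haversine_m(lat1, lon1, lat2, lon2)}

# Example (hypothetical pixel picks): measure_between_pixels((120, 340), (480, 355),
#                                                            pixel_to_latlon, "driveway length")
```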
[0008] The processor executable code when executed by the at least one processor may further cause the at least one processor to analyze a parcel database to determine an identity of neighboring parcels either adjacent to the target structure or within a predefined radius of the target structure, and to make available to a user information related to neighboring parcels when accessing information indicative of the target structure. The processor executable code when executed by the at least one processor may further cause the at least one processor to identify ownership or residency of neighboring parcels and to store a link within the database to information indicative of the owners or residents of neighboring parcels. The processor executable code when executed by the at least one processor may also cause the at least one processor to identify social media information of owners or residents of neighboring parcels and to store information within the database indicative of the social media information of the owners or residents of the neighboring parcels.
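A minimal sketch of the neighboring-parcel analysis above, assuming parcel footprints are available as shapely polygons in a projected, meter-based coordinate system (an assumption; the disclosure does not prescribe a geometry library or coordinate system):

```python
from shapely.geometry import Polygon

def neighboring_parcels(target: Polygon, parcels: dict, radius_m: float = 50.0):
    """Return ids of parcels that are adjacent to the target parcel or fall
    within a predefined radius of it."""
    search_area = target.buffer(radius_m)  # predefined radius around the target parcel
    hits = []
    for parcel_id, geom in parcels.items():
        if geom.equals(target):
            continue  # skip the target parcel itself
        if geom.touches(target) or geom.intersects(search_area):
            hits.append(parcel_id)
    return hits
```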
[0009] In a further aspect, the inventive concepts disclosed herein are directed to a non-transitory computer readable medium storing computer executable instructions that when executed by a processor cause the processor to determine data indicative of one or more predetermined features of a target structure displayed in an image, the one or more predetermined features selected from: a facet of the target structure facing a street, location of a main entrance of the target structure relative to the street, location of a secondary entrance of the target structure relative to the street, orientation and area of a driveway of the target structure, location of vehicle access to the target structure, size and area and elevations of a deck of the target structure, real-world geographic location and height of a tree depicted in the image with the target structure, a geographic location of a trunk of a tree depicted in the image with the target structure, an area of a window of the target structure, an area of siding (e.g., an area of a wall or portion of a wall) of the target structure depicted within multiple aerial images, a height of an eave of the target structure, height of a chimney of the target structure, a distance to one or more churches from the target structure, and/or social media information of a neighbor of the target structure.
[0010] In a further aspect, the inventive concepts disclosed herein are directed to a computer system comprising at least one processor capable of executing processor executable code operably coupled with a non-transitory computer readable medium storing processor executable code, which when executed by the at least one processor causes the at least one processor to: (a) receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; (b) in response to receiving the first signal, access a database including multiple aerial images of the target structure; (c) automatically identify an aerial image from the multiple aerial images depicting a facet of the target structure that faces a street; and (d) transmit a second signal over the computer network, the second signal indicative of the aerial image of the target structure depicting the facet of the target structure that faces the street. The processor executable code may further cause the processor to transmit a sequence of third signals depicting the multiple aerial images following the transmission of the second signal. The processor executable code may also cause the at least one processor to identify the facet of the target structure that faces the street by: (1) accessing a file identifying the street; (2) projecting a line through at least a portion of a facet of the target structure to the street; and (3) identifying the facet that the line is projected through as the facet of the target structure that faces the street. The processor executable code may further cause the at least one processor to identify a rear facet of the target structure as a facet that is positioned at about 180° relative to the facet of the target structure that faces the street.
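One possible plan-view realization of steps (1) through (3) above, sketched with shapely and assuming the building footprint and the street centerline are available as geo-referenced geometries (the names and library choice are illustrative, not part of the disclosure):

```python
from shapely.geometry import Polygon, LineString
from shapely.ops import nearest_points

def street_facing_facet(footprint: Polygon, street: LineString):
    """Identify which footprint edge (a facet, seen in plan view) faces the street
    by projecting a line from the structure's center toward the street and finding
    the edge that line crosses."""
    center = footprint.centroid
    street_pt = nearest_points(center, street)[1]  # closest point on the street line
    sight_line = LineString([(center.x, center.y), (street_pt.x, street_pt.y)])
    coords = list(footprint.exterior.coords)
    for a, b in zip(coords, coords[1:]):           # walk each facet edge of the footprint
        if LineString([a, b]).intersects(sight_line):
            return (a, b)                          # the street-facing facet
    return None
```

A rear facet could then be approximated as the footprint edge whose outward direction is roughly opposite (about 180°) to that of the returned edge.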
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0011] To assist those of ordinary skill in the relevant art in making and using the subject matter hereof, reference is made to the appended drawings, which are not intended to be drawn to scale, and in which like reference numerals are intended to refer to similar elements for consistency. For purposes of clarity, not every component may be labeled in every drawing.
[0012] FIG. 1 is a schematic diagram of hardware forming an exemplary embodiment of a computer system constructed in accordance with the present disclosure.
[0013] FIG. 2 is a block diagram of an embodiment of one or more host systems according to the instant disclosure.
[0014] FIG. 3 is a block diagram of an embodiment of one or more memories according to the instant disclosure.
[0015] FIG. 4 is a flowchart of an exemplary method for determining one or more physical attributes and dimensions of a foundation of a structure in accordance with the present disclosure.
[0016] FIG. 5 is a pictorial representation of an image showing an exemplary process for detecting location of a structure.
[0017] FIGS. 6-8 are simplified pictorial representations showing an exemplary process for defining edges of a structure in accordance with the present disclosure.
[0018] FIG. 9 is an exemplary embodiment of a foundation estimate report presentation page according to the present disclosure.
[0019] FIG. 10 is an exemplary embodiment of a window replacement estimate report presentation page according to the present disclosure.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0020] Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the disclosure is not limited in its application to the details of construction, experiments, exemplary data, and/or the arrangement of the components set forth in the following description or illustrated in the drawings.
[0021] The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for purpose of description and should not be regarded as limiting.
[0022] The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements.
[0023] As used herein, the terms "comprises," "comprising," "includes," "including," "has," "having" or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
[0024] As used in the instant disclosure, the terms "provide", "providing", and variations thereof comprise displaying or providing for display of a webpage (e.g., structure detection webpage) to one or more user terminals (e.g., an access point) interfacing with a computer and/or computer network(s) and/or allowing the one or more user terminal(s) to participate, such as by interacting with one or more mechanisms on a webpage (e.g., structure detection webpage) by sending and/or receiving signals (e.g., digital, optical, and/or the like) via a computer network interface (e.g., Ethernet port, TCP/IP port, optical port, cable modem, a DSL modem, POTS modem, and combinations thereof). A user may be provided with a web page in a web browser, or in a software application, for example.
[0025] Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
[0026] In addition, use of "a" or "an" is employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the inventive concept. This description should be read to include one or more, and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0027] Further, use of the term "plurality" is meant to convey "more than one" unless expressly stated to the contrary.
[0028] As used herein any reference to "one embodiment," "an embodiment", "one example," or "an example" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" or "one example" in various places in the specification are not necessarily all referring to the same embodiment or example.
[0029] Circuitry, as used herein, may be analog and/or digital components, or one or more suitably programmed microprocessors and associated hardware and software, or hardwired logic. Also, "components" may perform one or more functions. The term "component" may include hardware, such as a processor, an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA), or a combination of hardware and software. Software includes one or more computer executable instructions that when executed by one or more components cause the component to perform a specified function. It should be understood that the algorithms described herein are stored on one or more non-transient memories. Exemplary non-transitory memory includes random access memory, read only memory, flash memory or the like. Such non-transient memory may be electrically based or optically based, for example.
[0030] Generally, but not by way of limitation, the inventive concepts disclosed herein are directed to computer systems, databases, and methods for visual insight for a target location and/or property, stemming from a confluence of accumulated relevant data, visualizing imagery, maps, and measurement-based analysis. The target property or location may be any manmade or natural structure, building, or feature. Visual insight according to the inventive concepts disclosed herein may be provided to users in the form of virtual property reports including one or more images of the property, as will be described in detail herein, for example. Automated or semi-automated algorithms may be used to determine any measurements relating to the target location, and a user or an administrator may review the determination of the algorithms and accept or edit the determination, or direct the algorithm to start over, for example.
[0031] Referring now to the figures, and in particular to FIG. 1, shown therein is an exemplary structure detection system 10 constructed in accordance with the present disclosure. System 10 may be a system or systems that are able to embody and/or execute the logic of the processes described herein. Logic embodied in the form of software instructions or firmware may be executed on any appropriate hardware. For example, logic embodied in the form of software instructions or firmware may be executed on a dedicated system or systems, or on a personal computer system, or on a distributed processing computer system, or the like. In some embodiments, logic may be implemented in a stand-alone environment operating on a single computer system and/or logic may be implemented in a networked environment such as a distributed system using multiple computers and/or processors exchanging signals over a computer network and/or via one or more computer ports.
[0032] In some exemplary embodiments, system 10 may be distributed, and include one or more host systems 12 communicating with one or more user devices 14 via a network 16. As used herein, the terms "network-based," "cloud-based," and any variations thereof, are intended to include the provision of configurable computational resources on demand via interfacing with a computer and/or computer network, with software and/or data at least partially located on the computer and/or computer network, by pooling processing power of two or more networked processors.
[0033] In some exemplary embodiments, the network 16 may be the Internet and/or other network. For example, if the network 16 is the Internet, a primary user interface of system 10 may be delivered through a series of web pages. It should be noted that the primary user interface of the system 10 may be replaced by another type of interface, such as a Windows-based application (e.g., deploying the system 10 in a stand-alone environment such as a kiosk).
[0034] The network 16 may be almost any type of network. For example, in some embodiments, the network 16 may be an Internet and/or Internet 2 network (e.g., exist in a TCP/IP-based network). It is conceivable that in the near future, embodiments of the present disclosure may use more advanced networking topologies.
[0035] The one or more user devices 14 may include, but are not limited to implementation as a personal computer, a smart phone, network-capable television set, a television set-top box, a tablet, an e-book reader, a laptop computer, a desktop computer, a network-capable handheld device, a video game console, a server, a digital video recorder, a DVD-player, a Blu-Ray player, and combinations thereof, for example. In some embodiments, the user device 14 may include one or more input devices 18, one or more output devices 20, one or more processors (not shown) capable of interfacing with the network 16, processor executable code, and/or a web browser capable of accessing a website and/or communicating information and/or data over a network, such as the network 16. As will be understood by persons of ordinary skill in the art, the one or more user devices 14 may include one or more non-transitory computer memory comprising processor executable code and/or software applications, for example. Current embodiments of system 10 may also be modified to use any of these user devices 14 or future developed devices capable of communicating with the one or more host systems 12 via the network 16.
[0036] The one or more input devices 18 may be capable of receiving information input from a user and/or processor(s), and transmitting such information to the user device 14 and/or to the network 16. The one or more input devices 18 may include, but are not limited to, implementation as a keyboard, touchscreen, mouse, trackball, microphone, fingerprint reader, infrared port, slide-out keyboard, flip-out keyboard, cell phone, PDA, video game controller, remote control, fax machine, network interface, and combinations thereof, for example.
[0037] The one or more output devices 20 may be capable of outputting information in a form perceivable by a user and/or processor(s). For example, the one or more output devices 20 may include, but are not limited to, implementations as a computer monitor, a screen, a touchscreen, a speaker, a website, a television set, a smart phone, a PDA, a cell phone, a fax machine, a printer, a laptop computer, a web server, a network interface card or port, and combinations thereof, for example. It is to be understood that in some exemplary embodiments, the one or more input devices 18 and the one or more output devices 20 may be implemented as a single device, such as, for example, a touchscreen or a tablet. It is to be further understood that as used herein the term user is not limited to a human being, and may comprise, a computer, a server, a website, a processor, a network interface, a human, a user terminal, a virtual computer, and combinations thereof, for example.
[0038] The system 10 may include one or more host systems 12. For example, FIG. 1 illustrates system 10 having two host systems 12a and 12b, although a single host system 12 may be included in system 10, or in the alternative, more than two host systems 12 may be included in system 10. In some embodiments, the host systems 12 may be partially or completely network-based or cloud-based. The host system 12 may or may not be located in a single physical location. Additionally, multiple host systems 12 may or may not necessarily be located in a single physical location.
[0039] Each of the host systems 12 may be capable of interfacing and/or communicating with the one or more user devices 14 via the network 16. For example, the host systems 12 may be capable of interfacing by exchanging signals (e.g., analog, digital, optical, and/or the like) via one or more ports (e.g., physical ports or virtual ports) using a network protocol, for example. Additionally, each host system 12 may be capable of interfacing and/or communicating with other host systems directly and/or via the network 16, such as by exchanging signals (e.g., analog, digital, optical, and/or the like) via one or more ports.
[0040] For simplicity, the host system 12a may be referred to hereinafter as the "first host system" and the host system 12b may be referred to hereinafter as the "second host system." The network 16 may permit bi-directional communication of information and/or data between the first host system 12a, the second host system 12b, and/or user devices 14. The network 16 may interface with the first host system 12a, the second host system 12b, and the user devices 14 in a variety of ways. For example, the network 16 may interface by optical and/or electronic interfaces, and/or may use a plurality of network topologies and/or protocols including, but not limited to, Ethernet, TCP/IP, circuit switched paths, and/or combinations thereof. For example, in some embodiments, the network 16 may be implemented as the World Wide Web (or Internet), a local area network (LAN), a wide area network (WAN), a metropolitan network, a wireless network, a cellular network, a GSM-network, a CDMA network, a 3G network, a 4G network, a satellite network, a radio network, an optical network, a cable network, a public switched telephone network, an Ethernet network, and/or combinations thereof, for example. Additionally, the network 16 may use a variety of network protocols to permit bi-directional interface and/or communication of data and/or information between the first host system 12a, the second host system 12b, and/or one or more user devices 14.
[0041] Referring to FIGS. 1 and 2, in some embodiments, the first host system 12a may comprise one or more processors 30 working together, or independently, to execute processor executable code, one or more memories 32 capable of storing processor executable code, one or more input devices 34, and one or more output devices 36. Each element of the first host system 12a may be partially or completely network-based or cloud-based, and may or may not be located in a single physical location.
[0042] The one or more processors 30 may be implemented as a single or plurality of processors working together, or independently, to execute the logic as described herein. Exemplary embodiments of the one or more processors 30 may include, but are not limited to, a digital signal processor (DSP), a central processing unit (CPU), a field programmable gate array (FPGA), a microprocessor, a multi-core processor, and/or combinations thereof, for example. The one or more processors 30 may be capable of communicating with the one or more memories 32 via a path (e.g., data bus). The one or more processors 30 may be capable of communicating with the input devices 34 and/or the output devices 36.
[0043] The one or more processors 30 may be further capable of interfacing and/or communicating with the one or more user devices 14 via the network 16. For example, the one or more processors 30 may be capable of communicating via the network 16 by exchanging signals (e.g., analog, digital, optical, and/or the like) via one or more ports (e.g., physical or virtual ports) using a network protocol. It is to be understood that in certain embodiments using more than one processor 30, the processors 30 may be located remotely from one another, located in the same location, or may comprise a unitary multi-core processor. The one or more processors 30 may be capable of reading and/or executing processor executable code and/or capable of creating, manipulating, retrieving, altering, and/or storing data structures into one or more memories 32.
[0044] The one or more memories 32 may be a non-transitory computer memory capable of storing processor executable code. Additionally, the one or more memories 32 may be implemented as a non-transitory random access memory (RAM), a CD-ROM, a hard drive, a solid state drive, a flash drive, a memory card, a DVD-ROM, a floppy disk, an optical drive, and/or combinations thereof, for example.
[0045] In some embodiments, one or more memories 32 may be located in the same physical location as the first host system 12a, and/or the one or more memories 32 may be located remotely from the first host system 12a. For example, one or more memories 32 may be located remotely from the first host system 12a and communicate with the one or more processors 30 via the network 16. Additionally, when more than one memory 32 is used, a first memory 32 may be located in the same physical location as the one or more processors 30, and additional memories 32 may be located in a remote physical location from the one or more processors 30. It should be noted that the physical location(s) of the one or more memories 32 may be varied. Additionally, one or more of the memories 32 may be implemented as a "cloud memory" (i.e., one or more memories 32 may be partially or completely based on or accessed using the network 16).
[0046] The one or more input devices 34 may transmit data to the one or more processors 30 and may include, but are not limited to, implementations as a keyboard, a mouse, a touchscreen, a camera, a cellular phone, a tablet, a smart phone, a PDA, a microphone, a network adapter, and/or combination thereof, for example. The input devices 34 may be located in the same physical location as the one or more processors 30, or may be remotely located and/or partially or completely network-based.
[0047] The one or more output devices 36 may transmit information from the one or more processors 30 to a user, such that the information may be perceived by the user. For example, the output devices 36 may include, but are not limited to, implementations as a server, a computer monitor, a cell phone, a tablet, a speaker, a website, a PDA, a fax, a printer, a projector, a laptop monitor, and/or combinations thereof, for example. The one or more output devices 36 may be physically located with the one or more processors 30, or may be located remotely from the one or more processors 30, and may be partially or completely network-based (e.g., website). As described herein, the term "user" is not limited to a human, and may comprise a human, a computer, a host system, a smart phone, a tablet, and/or combinations thereof, for example.
[0048] Referring to FIGS. 1-3, the one or more memories 32 may store processor executable code and/or information comprising one or more databases 40 and program logic 42. In some embodiments, the processor executable code may be stored as a data structure, such as a database and/or a data table in the one or more memories 32, for example.
[0049] The second host system 12b may be similar or substantially similar in design and concept as the first host system 12a as described herein. The first host system 12a may directly communicate with the second host system 12b and/or communicate via the network 16. Generally, the first host system 12a may include one or more processors 30 capable of executing a first set of processor executable code and the second host system 12b may include one or more processors 30 capable of executing a second set of processor executable code.
[0050] In some embodiments, the first host system 12a and the second host system 12b may be independently or cooperatively controlled by separate entities, or may be controlled by the same entity. For example, the first host system 12a may be controlled by a first company and the second host system 12b may be controlled by a second company distinct from the first company. For example, the first host system 12a may be controlled by an imaging company and the second host system 12b may be controlled by a building material supplier. The imaging company may be a separate entity from the building material supplier. Other entities may control either the first host system 12a and/or the second host system 12b including, but not limited to, building contractors, real estate agencies, weather agencies, community agencies, home maintenance companies (e.g., gardeners, housekeeping services, window washers, pool maintenance companies, cleaning companies, and/or the like), federal agencies, state agencies, municipal agencies, schools, religious organizations, sport and recreation agencies, insurance agencies, historical commissions, utility agencies (e.g., water, gas, electric, sewer, phone, cable, internet, and/or the like), commercial agencies (e.g., grocery stores, big box stores, malls, restaurants, gas/auto service stations, and/or the like), news agencies, travel agencies, mapping agencies, and/or the like.
[0051] In general, system 10 may be configured to display and navigate geo- referenced imagery, such as aerial oblique imagery or aerial orthogonal imagery, and/or maps, sketches, and two-dimensional or three-dimensional models (e.g., location-centric). The geo-referenced imagery may be represented by a pixel map and/or by a series of tiled pixel maps that when aggregated recreate an image pixel map. Alternatively, the oblique imagery may be applied to one or more maps (e.g., street or parcel) or two-dimensional or three-dimensional models of structure(s) 102 depicted within the one or more two or three-dimensional models, rather than being applied to an image pixel map, for example. The geo-referenced imagery may be stored in a database 40 as one or more electronic files that may be rendered into a picture, an image, or a sketch. The electronic files may be in any suitable image format (e.g., JPEG, BMP, TIFF, and/or the like).
[0052] The system 10 will be described by way of an example utilizing aerial geo-reference images as the geo-referenced imagery. However, it should be understood that system 10 may use other types of geo-referenced images and/or geo-referenced information, such as architectural images, sketches, street-view type images, terrestrial images, and combinations thereof, for example.
[0053] FIG. 4 illustrates a flow chart 70 of an exemplary method for generally detecting one or more physical properties of a structure, such as the structure 102.
[0054] In a step 72, a target address for a particular structure may be obtained (e.g., selection on an image, reverse geo-coding an address, performing a parcel database look-up). For example, in some embodiments, a user may input a target address of the property or structure (e.g., the structure 102) in the one or more user devices 14. The terms "property" and "structure" may be used interchangeably with one another and may include a structure 102, along with surrounding land, buildings, structures, or features.
[0055] FIG. 4 provides an exemplary detailed method 70 for obtaining a footprint of a foundation of a structure 102 using the system and method illustrated in FIGS. 1 and 2. System 10 may be described by way of example utilizing aerial geo-referenced images as the geo-referenced imagery. However, it should be understood that system 10 may use other types of geo-referenced images, such as architectural images. In some embodiments, the methods as described herein may be used in addition to systems and methods as described in U.S. application Serial No. 61/564,699, which is hereby incorporated by reference in its entirety.
[0056] In some embodiments, a user (e.g. customer) may register a user profile with the first host system 12a and/or the second host system 12b. The user profile may be created and/or stored. For example, the user may be prompted by the first host system 12a and/or the second host system 12b to provide login credentials (e.g., username and/or password). Login credentials may allow the first host system 12a and/or the second host system 12b to authenticate the user. The user profile may include information including, but not limited to, demographic information including, but not limited to, name, age, address, billing account information, username, password, behavioral information, experience, gender, and/or the like.
[0057] Referring to FIGS. 4 and 5, in a step 72, a target location may be selected for a building or a structure 102. For example, in some embodiments, a user may input a target location into one or more user devices 14 by clicking on an image, reverse geo-coding an address, performing a parcel database look-up, and/or the like. Alternatively, the system 10 may automatically provide a target location. For example, the first host system 12a and/or the second host system 12b may provide a target location for evaluation without user input.
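By way of illustration only, a target address supplied in step 72 could be resolved to geographic coordinates with any geocoding service so that matching geo-referenced images can be selected; the snippet below uses the geopy/Nominatim geocoder purely as an example, which the disclosure does not require or specify.

```python
from geopy.geocoders import Nominatim  # one possible geocoder; not specified by the disclosure

def target_location_from_address(address: str):
    """Resolve a user-supplied target address to (latitude, longitude)."""
    geolocator = Nominatim(user_agent="structure-detection-example")
    hit = geolocator.geocode(address)
    if hit is None:
        raise ValueError(f"Address could not be geocoded: {address}")
    return hit.latitude, hit.longitude
```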
[0058] In a step 74, the first host system 12a and/or the second host system 12b may select one or more images containing the target location. In some embodiments, oblique geo-referenced images may be obtained using oblique aerial imagery as described in U.S. Patent No. 7,787,659, U.S. Patent No. 7,873,238, U.S. Patent No. 7,424,133, and U.S. Patent No. 5,247,356, all of which are hereby expressly incorporated by reference in their entirety.
[0059] Further, geo-referenced images may be obtained using oblique terrestrial imagery. For example, in some embodiments, images may be obtained using oblique terrestrial imagery if the images are capable of being measured upon and/or determined to reveal physical attributes of a structure as described herein.
[0060] Geo-referenced images may be images having stored geo-referenced parameters. For example, geo-referenced images and parameters, when combined with a ground plane (as described in U.S. Patent No. 7,424,133), may provide a determination of pixel location in real world coordinates (e.g., latitude, longitude) in the geo-referenced image. Measurements may then be made by calculating a difference in pixel location between points of measurement. For example, for distance measurements, calculations may be determined using circle routes and/or across a terrain by tracing a route along a ground plane or between multiple points in a structure. Automated or semi-automated algorithms may be used to determine any of the measurements, and a user or an administrator may review the measurement of the algorithms and accept or edit the measurement, or direct the algorithm to start over, for example.
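A simplified sketch of the ground-plane idea described above (not the specific method of the incorporated patents): given a camera exposure station and a pixel's viewing ray derived from the stored geo-referenced parameters (both assumed to be available from the image metadata), the pixel's ground position follows from a ray/plane intersection, and planimetric distances can then be differenced.

```python
import numpy as np

def pixel_ground_intersection(camera_pos, ray_dir, ground_z=0.0):
    """Intersect a pixel's viewing ray with a flat ground plane to estimate the
    pixel's real-world position. camera_pos is the exposure station (x, y, z) and
    ray_dir is the unit ray for the pixel, both assumed known from the camera model."""
    camera_pos = np.asarray(camera_pos, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    if ray_dir[2] >= 0:
        raise ValueError("Ray does not point toward the ground plane")
    t = (ground_z - camera_pos[2]) / ray_dir[2]  # parametric distance along the ray
    return camera_pos + t * ray_dir              # (x, y, ground_z) in world units

# Two ground points can then be differenced for a simple planimetric measurement:
# distance = np.linalg.norm(p1[:2] - p2[:2])
```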
[0061] Referring to FIGS. 1, 4, and 5, in a step 76, the first host system 12a and/or the second host system 12b may display an image of the target location on one or more user devices 14. For example, the target location in FIG. 5 includes the structure 102. In some embodiments, the first host system 12a and/or the second host system 12b may display the image of the target location on one or more output devices 36. In some embodiments, multiple geo-referenced images showing multiple facets of the structure 102 may be provided. For example, geo-referenced images from each cardinal direction (e.g., North, South, East, and West) may be provided for each structure 102.
[0062] In some embodiments, the first host system 12a and/or the second host system 12b may use an application (e.g., software application) to evaluate and/or select the one or more geo-referenced images including the target location and/or structure 102. Additionally, in some embodiments, the first host system 12a and/or the second host system 12b may use an application to evaluate and/or select additional geo-referenced images from other cardinal directions. For example, determination of a suitable geo-referenced image for the target location or structure 102 may be made by using methods and systems as described in U.S. Patent Application Serial No. 12/221,571, which is hereby incorporated by reference in its entirety. Additionally, in some embodiments, the first host system 12a and/or the second host system 12b may identify one or more structures 102 in the target location through a building detection algorithm.
[0063] For example, in a step 80, a user may select the target structure 102 on the image of the target location using one or more user devices 14 and/or input devices 34.
[0064] The first host system 12a and/or the second host system 12b may select a geo-referenced image (e.g., orthogonal image) displaying a roof or other physical attribute of the structure 102, as in step 82.
[0065] In a step 84, the first host system 12a and/or the second host system 12b may estimate the boundary of the structure 102 from a structure detection algorithm using systems and methods as described in U.S. patent application Serial No. 12/221,571, which is hereby incorporated by reference in its entirety.
[0066] In a step 86, the first host system 12a and/or the second host system 12b may select one or more geo-referenced images (e.g., oblique images) including one or more facets of the structure 102. In some embodiments, at least one image for each cardinal direction may be provided for the structure 102. FIG. 6 illustrates a simplified view of one facet of the structure 102.
[0067] The first host system 12a and/or the second host system 12b may run one or more edge detection algorithms on the one or more geo-referenced images of the structure 102, as in step 88. The multiple edge detection algorithms may include an edge detection confidence rating. In a step 90, the first host system 12a and/or the second host system 12b may store in the one or more memories 32 the geo-referenced information from the image(s), general information about the image(s), confidence value(s), and/or the like.
[0068] The first host system 12a and/or the second host system 12b may compare each edge detection confidence value from each edge detection algorithm, as in step 92. The first host system 12a and/or the second host system 12b may also classify each edge (e.g., "good," "likely," "unlikely," "discard") based on its likelihood of being an actual edge of the structure 102, as in step 94.
[0069] In a step 96, the first host system 12a and/or the second host system 12b may classify each edge based on orientation. For example, edges may be classified by horizontal edges 104, vertical edges 106, sloped edges 108, and the like.
[0070] Referring to FIGS. 4-6, the first host system 12a and/or the second host system 12b may compare the detected edges to the boundary area, as in step 98. Detected edges may be limited to those within the boundary area, as in step 100. Additionally, the outermost horizontal edges and the outermost vertical edges may be determined. For example, FIG. 7 illustrates the outermost vertical edges 106 of FIG. 6.
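The snippet below sketches one way the edge detection, orientation classification, and boundary restriction of steps 88 through 100 could be approximated with off-the-shelf tools (OpenCV Canny/Hough); the thresholds are illustrative assumptions and are not taken from the disclosure.

```python
import math
import cv2
import numpy as np

def detect_and_classify_edges(image_bgr, boundary):
    """Detect straight edges in a facade image and classify them by orientation.
    `boundary` is (x, y, w, h) for the estimated structure boundary area; only
    edges fully inside it are kept."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 60,
                            minLineLength=30, maxLineGap=5)
    bx, by, bw, bh = boundary
    classified = {"horizontal": [], "vertical": [], "sloped": []}
    for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
        inside = (bx <= min(x1, x2) and max(x1, x2) <= bx + bw and
                  by <= min(y1, y2) and max(y1, y2) <= by + bh)
        if not inside:
            continue  # limit detected edges to the boundary area (steps 98-100)
        angle = abs(math.degrees(math.atan2(y2 - y1, x2 - x1))) % 180
        if angle < 10 or angle > 170:
            classified["horizontal"].append((x1, y1, x2, y2))
        elif 80 < angle < 100:
            classified["vertical"].append((x1, y1, x2, y2))
        else:
            classified["sloped"].append((x1, y1, x2, y2))
    return classified
```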
[0071] Referring to FIG. 8, in a step 102, the first host system 12a and/or the second host system 12b may determine horizontal edges 104 that meet the outermost vertical edges 106. Alternatively, the determination may be for vertical edges 106 that meet the outermost horizontal edges 104.
[0072] The first host system 12a and/or the second host system 12b may further determine if multiple facets of the structure 102 exist within the image, as in steps 106-114. In some embodiments, however, the first host system 12a and/or the second host system 12b may skip the determination of multiple facets and proceed to step 116. Determination of facets may be useful in defining property attributes such as windows, siding, eaves, and the like, as will be discussed in further detail herein. Referring to steps 106-114, in determining facets, the first host system 12a and/or the second host system 12b may extrapolate the horizontal edges 104, vertical edges 106, and/or sloped edges 108 of each facet of the structure 102, as in step 108. The first host system 12a and/or the second host system 12b may pair the vertical edges 106 and/or pair the horizontal edges 104 to determine the number of facets of the structure 102 within the boundary area 103 of the image, as in step 110. Property attributes (e.g., significant features) of the structure 102 may be detected, as in step 112. For example, using pairings of horizontal edges 104, vertical edges 106, and/or sloped edges 108, windows may be detected on the structure 102. Additional property attributes may include, but are not limited to, eaves, siding, chimney(s), and/or the like.
[0073] In a step 114, the first host system 12a and/or the second host system 12b may locate geometric vertices of each facet of the structure 102. In some embodiments, geometric vertices may be determined by an angle of intersection (Θ) between horizontal edges 104 and vertical edges 106. For example, the horizontal and vertical edges may intersect at approximately ninety degrees relative to each other, as in FIG. 8.
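A small helper illustrating the vertex test of step 114, treating the meeting point of two edges as a geometric vertex when they intersect at roughly ninety degrees (the tolerance value is an assumption):

```python
import math

def intersection_angle_deg(edge_a, edge_b):
    """Angle between two edges, each given as ((x1, y1), (x2, y2))."""
    (ax1, ay1), (ax2, ay2) = edge_a
    (bx1, by1), (bx2, by2) = edge_b
    ta = math.atan2(ay2 - ay1, ax2 - ax1)
    tb = math.atan2(by2 - by1, bx2 - bx1)
    diff = abs(math.degrees(ta - tb)) % 180
    return min(diff, 180 - diff)

def is_vertex(edge_a, edge_b, tolerance_deg=10.0):
    """True when a horizontal and a vertical edge meet at approximately 90°."""
    return abs(intersection_angle_deg(edge_a, edge_b) - 90.0) <= tolerance_deg
```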
[0074] In a step 116, the first host system 12a and/or the second host system 12b may determine ground location 110 for one or more facets of the structure 102. In a step 118, the first host system 12a and/or the second host system 12b may select horizontal edges 104 that connect to the vertical edges 106 parallel to a roofline at the lowest point within the boundary area 103. In a step 120, the first host system 12a and/or the second host system 12b may compare the ground location 110 with the lowest selected horizontal edge 104 connecting the vertical edge 106 as determined in step 118. If the horizontal edge 104 is determined to be the lowest edge at the ground location 110 or substantially close to the ground location 110, the horizontal edge 104 may be tagged as a portion of the footprint of the structure 102, as in step 124. Alternatively, the first host system 12a and/or the second host system 12b may connect a horizontal line between vertical edges 106 at the ground location 110. For example, the first host system 12a and/or the second host system 12b may connect a horizontal line between vertical edges 106 at the ground location 110 in a position parallel to a roofline of the structure 102, as in step 126. The first host system 12a and/or the second host system 12b may continue to find each horizontal edge 104 at the ground location 110 for the structure as indicated by steps 128 and 130. The first host system 12a and/or the second host system 12b may continue to process each image of the structure 102 as indicated by steps 132 and 134.
[0075] Once all horizontal edges 104 of the structure 102 are determined, the first host system 12a and/or the second host system 12b may use the horizontal edges 104 at the ground location 110 to create, provide, and/or store a footprint of the structure 102 as indicated by step 136. The first host system 12a and/or the second host system 12b may then determine the footprint of the foundation of the structure 102 based on the footprint of the structure 102. For example, in some embodiments, the footprint of the structure 102 will be the footprint of the foundation of the structure 102. Alternatively, the footprint of the foundation may include additional portions outside of the range of the footprint of the structure 102 (e.g., additional porch or deck square footage). As such, the first host system 12a and/or the second host system 12b may additionally manipulate the footprint of the structure 102. Such additional manipulation may be pre-assigned and/or user initiated. Automated or semi-automated algorithms may be used to make any of the above determinations and/or measurements, and a user or an administrator may review the determinations and/or measurements of the algorithms and accept or edit the measurement or determination, or direct the algorithm to start over, for example.
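Once the ground-level footprint vertices are assembled as in step 136, the planimetric area used for a foundation estimate can be computed with the shoelace formula; the sketch below assumes ordered (x, y) vertices in a meter-based coordinate system.

```python
def footprint_area(vertices):
    """Planimetric area of a closed footprint from its ordered ground-level
    (x, y) vertices, via the shoelace formula. Units follow the input
    coordinates (e.g., square meters for meter-based coordinates)."""
    n = len(vertices)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the polygon
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0

# Example: a simple 12 m x 9 m rectangular footprint -> 108.0
# footprint_area([(0, 0), (12, 0), (12, 9), (0, 9)])
```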
[0076] While an automated procedure is described herein with reference to FIG. 4, it should be understood that semi-automated procedures may be used for determining measurements, sketches, and/or three-dimensional models. For example, the first host system 12a, the second host system 12b, the one or more input devices 18, and/or the one or more user devices 14, may be included with computer readable instructions stored on one or more non-transient memories that when executed by the one or more processors permit the user to select one or more pixels within one or more displayed images indicative of a location of structure(s), vertices, and/or other displayed features. The first host system 12a and/or the second host system 12b, the one or more input devices 18, and/or the one or more user devices 14 may then calculate a three-dimensional location for the one or more selected pixels. Further, various types of photogrammetry may be used to determine a three-dimensional location of the one or more selected pixels. For example, stereo photogrammetry, single image photogrammetry, and/or the like may be used.
[0077] In some exemplary embodiments, the program logic 42 may cause the first host system 12a and the second host system 12b to cooperate with one another to aggregate geo-referenced data and/or to geo-reference and then aggregate data in the database 40 to be used for the various reports as will be described below. Multiple measurements may be collected and stored in the database 40, to be provided to users in the form of a virtual property report provided on a computer screen, or as a physical property report provided as an electronic file and/or a copy or a printout, for example.
[0078] In some embodiments, the program logic 42 may cause the first host system 12a and/or the second host system 12b to aggregate geo-referenced data and/or to geo-reference and then aggregate data in the database 40, and to use such aggregated data for a variety of property reports which may include various compilations of fact-based information aggregated from multiple sources and presented co-incident with geo-referenced metric oblique imagery, for example. The virtual property reports may provide for extensive user interaction and may allow users to determine what metric analysis is included in the report, either when the report is requested, or at any time while viewing the virtual property report, and combinations thereof. For instance, when looking at a property to be insured, the inspector may wish to know how far the structure is from the nearest adjacent structure. The program logic 42 may include an automatic or a semi-automatic (user assisted) method or tools for determining this distance with imagery and/or other data available to the program and recording the facts and including the facts in the subsequently prepared virtual property report(s). Automated or semi-automated algorithms may be used to make any of the determinations and/or measurements described herein, and a user or an administrator may review the determinations and/or measurements of the algorithms and accept or edit the measurement or determination, or direct the algorithm to start over, for example.
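For the adjacent-structure example above, a minimal sketch of the automatic distance determination, assuming the structure footprints are available as shapely polygons in a common projected (meter-based) coordinate system, which the disclosure does not prescribe:

```python
from shapely.geometry import Polygon

def nearest_structure_distance(target, others):
    """Distance (in the footprints' coordinate units) from the target structure
    to the nearest adjacent structure, for inclusion in a virtual property report."""
    return min(target.distance(other) for other in others)
```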
[0079] As an example, a potential customer may request a cost estimate from a cleaning company to wash the windows of a structure 102. To provide an accurate estimate, the cleaning company may wish to determine how many windows are to be washed, the area of the windows, the height of the windows above the ground, access to the windows (e.g., is there any landscaping or shrubbery that would interfere with the window washing operation), can a ladder be used to reach upper level windows, are there any bay windows or recessed dormer windows, are there insect screens, etc. To that end, the cleaning company may request a window report for a structure 102, which may include the above information about the structure 102, along with any other pertinent information to enable the cleaning company to provide an accurate estimate without physical visitation of the structure 102, for example. Further, the cleaning company may benefit from an additional insight regarding the structure owner's ability to pay, for example, what is the general affluence of the area where the structure 102 is located, what is the assessed tax value of the structure 102, where do the owners work, and what is their work and/or work title (e.g., from social media information), etc. This information may be provided as a virtual or physical property report according to the inventive concepts disclosed herein.
[0080] In one exemplary embodiment, virtual or physical property reports according to the inventive concepts disclosed herein may help users determine and document changes that have occurred over time with respect to a particular structure 102, property, or a general area such as a city, a park, a neighborhood, a housing division, a business or office park, etc. The program logic 42 may include processor executable instructions to visually and analytically compare a target structure 102, property, or location at two periods of time. For example, a virtual property report according to the inventive concepts disclosed herein may include a dual-pane or a multi-pane viewer to synchronize viewing the same structure 102 or one or more facets thereof from two different dates. The dual-pane viewer (e.g., a web page or a printout) may include user tools to mark changes, measure differences (area of structure 102 or area of turf grass, etc.). In some embodiments, users may manually note the differences, for example. The host system 12a and/or 12b may record a user's analysis and may insert such analysis into a virtual property report that documents the work, the sources of information and the results, for example. The virtual property report may include information such as: how much less green space is there now than before, how much taller is the oak tree now than before, did the neighbor have the in-ground pool back in 2005, was that back deck on the house in 2007, and any other measurement and/or property related question that can be determined from geo-referenced imagery, or from other data sources, for example, that is compiled into a database.
[0081] Referring to FIGS. 1, 4 and 9, in some embodiments, a user may be able to receive a foundation estimate report 200 using the system 10. For example, the user may request an estimate for work on the foundation of the structure 102 using one or more user devices 14. In some embodiments, a user may request an estimate for foundation work using one or more input devices 34 of the first host system 12a and/or the second host system 12b. The first host system 12a and/or the second host system 12b may use the method as detailed in FIG. 4 to create, provide, and/or store the footprint of the foundation and/or the dimensions of the foundation for the structure 102. In some embodiments, a user may be able to manipulate results of the method. For example, the user may be able to verify results of the footprint, add additional portions and/or details, and/or remove portions from consideration.
[0082] The first host system 12a and/or the second host system 12b may provide a report to the user for the foundation work to be performed on the structure 102. For example, FIG. 9 illustrates an exemplary foundation estimate report 200. The foundation estimate report 200 may be distributed using the first host system 12a and/or the second host system 12b to the one or more user devices 14 and/or input devices 34. In some embodiments, the report 200 may be distributed to a third party system in addition to, or in lieu of, the user. For example, if the user is a homeowner, the report 200 may be distributed to the customer and/or to a third party system such as a material supplier, insurance company, real estate agency, home service company, cleaning company, auditing company, contractors, or the like. As used herein "agency" is intended to include an individual, a group of individuals, a commercial or charity organization or enterprise, a legal entity (e.g., a corporation), an organization whether governmental or private, and combinations thereof.
[0083] The foundation estimate report 200 may include data sets such as customer information 202, foundation information 204, estimated area detail 206, and contractor information 208.
[0084] The customer information data set 202 may include the customer name, customer contact information, and/or the like. The foundation information data set 204 may include one or more images of the structure 102. The estimated area detail data set 206 may include the total estimated square footage of the foundation as determined using the system 10 as described herein.
[0085] The contractor data set 208 may include one or more contractor names and/or associated contractor contact information. For example, the contractor data set 208 may comprise information about building contractors within a given geographic location. Each contractor may be associated with a contractor profile having information including, but not limited to, business name, contractor owner name, address, experience level, specialties performed, insurance coverage, age of contractor business, review or ranking information, and/or the like. For example, the contractor data set 208 may include review information. The review or ranking information may include positive and/or negative feedback relating to each contractor. For example, the review or ranking information may be based on prior customer feedback of customers using the system 10. Review information may also be obtained from one or more outside databases (e.g., Yelp, Google review, and/or the like).
[0086] In some embodiments, contractors may self-register information with the first host system 12a and/or the second host system 12b. For example, a contractor may set up a contractor profile within the first host system 12a and/or the second host system 12b. The contractor profile may have information including, but not limited to, business name, contractor owner name, address, experience level, age of contractor business, review information, and/or the like.
[0087] In some embodiments, additional data sets may be included within the foundation estimate report 200. For example, data sets may include, but are not limited to, weather data, insurance/valuation data, census data, school district data, real estate data, and/or the like.
[0088] Weather data sets may be created, provided, and/or stored by one or more databases storing information associated with weather (e.g., inclement weather). A weather data set within the foundation estimate report 200 may include, but is not limited to, hail history information and/or location, wind data, severe thunderstorm data, hurricane data, tornado data, flooding data, and/or the like. In some embodiments, the one or more databases including weather information may be hosted by a separate system (e.g., LiveHailMap.com) and contribute information to the first host system 12a and/or the second host system 12b. In some embodiments, the separate system (e.g., LiveHailMap.com) may be one of the first host system 12a or the second host system 12b. The weather data set may be included within the foundation estimate report 200 and provided to the user and/or other parties.
[0089] Insurance and/or valuation data sets may be created, provided, and/or stored by one or more databases storing information associated with property insurance and/or valuation. An insurance and/or valuation data set may include, but is not limited to, insured value of the home, insurance premium amounts, type of residence (e.g., multi-family, single family), number of floors (e.g., multi-floor, single-floor), building type, location relative to recognized hazard zones (wind, hail, flood, etc.), eligibility for special insurance coverage (flood, storm surge), and/or the like. The location relative to recognized hazard zones may be measured using a walk-the-Earth feature along ground planes as described in U.S. Patent Nos. 7,424,133 and 8,233,666, for example. Automated or semi-automated algorithms may be used to make any of the determinations and/or measurements, and a user or an administrator may review the determinations and/or measurements of the algorithms and accept or edit the measurement or determination, or direct the algorithm to start over, for example.
[0090] In some embodiments, the one or more databases may be hosted by a separate system (e.g., Bluebook, MSB, 360Value), and contribute information to the first host system 12a and/or the second host system 12b. In some embodiments, the one or more databases may be included in the first host system 12a and/or the second host system 12b.
[0091] The insurance and/or valuation data set(s) may be included within the foundation estimate report 200 and provided to the user and/or other parties. For example, during underwriting of a home, an insurance company may request the foundation estimate report 200 on a home that was recently purchased. The information within the foundation estimate report 200 may be integrated with insurance information from an insurance database and used to form a quote report. The insurance and/or valuation data may be sent to the user and/or to the insurance company as part of the foundation estimate report 200, or separately, for example. Alternatively, the report 200 may be sent solely to the insurance company, with the insurance company using the information to formulate an insurance quote.
[0092] In another example, the report 200 may be used in an insurance claim. In the case of property damage and/or loss suffered by a customer, one or more databases may be used to create, provide, and/or store an insurance dataset with claim information in the report 200. For example, an insurance database having a policy in force (PIF) and a weather database may be used to correlate information regarding an insurance claim for a particular roof. This information may be provided within the report 200.
[0093] Real estate and/or census data sets may also be included within the one or more of the reports described herein, such as the report 200. The real estate and/or census data sets may be created and stored in one or more databases having detailed information of the structure 102. For example, a real estate data set may include, but is not limited to, the homeowner's name, the purchase price of the home, the number of times the home has been on the market, the number of days the home has been on the market, the lot size, the rental history, number of bedrooms, number of bathrooms, fireplaces, swimming pools, hot tubs, and/or the like.
[0094] Real estate assessment information may also be included within the real estate data set. Real estate assessment information may include, but is not limited to, one or more tools for locating comparable properties to the structure 102, tax assessments of similar valued properties, old photos of the structure 102 and/or adjacent areas, history of the property title (e.g., abstract), prior sales, prior owners, rental history, and/or the like.
[0095] The census data set may include information regarding the number of residents within the home or structure 102, names and addresses of neighborhood or community residents and/or companies, ages of the surrounding population, gender of the surrounding population, occupations of the surrounding population, income level of the surrounding population, public personas, social media information of neighbors (e.g., social media identities of neighbors, social media friends, followers, or connections of neighbors, or other social media profile information of neighbors, such as pets, favorite music and movies, social organization membership, hobbies, favorite sports and sport teams, relationship status, recent social media posts or status updates, work, profession, work title, etc.), presence of convicted felons, sex offender identification, and/or the like. In some embodiments, the one or more databases 40 may be hosted by a separate system (e.g., Core Logic) and contribute information to the first host system 12a and/or the second host system 12b to provide data sets as described herein. Alternatively, the one or more databases 40 may be integrated within the first host system 12a or the second host system 12b.
[0096] In some embodiments, the first host system 12a and/or the second host system 12b may include logic and/or computer executable instructions to determine and/or store one or more physical attributes of the structure 102. Physical attributes of the structure 102 may include, but are not limited to, the number and/or dimensions or areas of windows of the structure 102, the number and/or dimensions of doors, the amount and/or percentage of building materials (e.g., siding, stone, and/or the like), the dimension of attributes associated with the structure 102 (e.g., height of eaves, height and/or dimension of chimney(s), and/or the like), and/or the like. Physical attributes may include any attribute that may be measured or estimated or calculated by measuring distances between the X, Y, and Z locations of the geometric vertices and/or selected points by analyzing an aerial or terrestrial image with computerized methodologies as described herein. Automated or semi-automated algorithms may be used to make any of the determinations and/or measurements described herein, and a user or an administrator may review the determinations and/or measurements of the algorithms and accept or edit the measurement or determination, or direct the algorithm to start over, for example.
[0097] In some embodiments, the first host system 12a and/or the second host system 12b may determine the number and/or dimensions or area of windows of the structure 102. For example, referring to FIGS. 1 and 4, in step 108 the first host system 12a and/or the second host system 12b may extrapolate horizontal edges 104, vertical edges 106, and/or sloped edges 108 for each facet of a structure 102. Horizontal edges 104 may be paired, vertical edges 106 may be paired, and/or sloped edges 108 may be paired as in step 110. Pairing of edges may reveal an overview of significant features (e.g., physical attributes), such as windows of the structure 102. The first host system 12a and/or the second host system 12b may then locate geometric vertices of the horizontal edges 104, vertical edges 106, and/or sloped edges 108, to outline one or more windows of the structure 102. Dimensions of the one or more windows may be determined by measuring distances between the X, Y, and Z locations of the geometric vertices and/or selected points by analyzing the image data as discussed above, and/or within a separate image. Area of the windows may be determined from the dimensions of the windows, for example. The X, Y, and Z locations may be determined with the techniques described in U.S. Patent No. 7,424,133, for example.
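By way of a non-limiting illustration only, the following sketch shows how window width, height, and area could be derived from the X, Y, and Z locations of geometric vertices by measuring three-dimensional distances; the function names and coordinate values are hypothetical and are not part of the disclosed system.

# Illustrative sketch only: derives window dimensions from the X, Y, Z
# locations of geometric vertices outlining a rectangular window.  The vertex
# coordinates are hypothetical stand-ins for values produced by the
# photogrammetric measurement described above.
import math

def distance_3d(p1, p2):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def window_dimensions(top_left, top_right, bottom_left):
    """Return (width, height, area) for a rectangular window outline."""
    width = distance_3d(top_left, top_right)
    height = distance_3d(top_left, bottom_left)
    return width, height, width * height

# Example: a 3 ft wide by 5 ft tall window on a vertical facet -> (3.0, 5.0, 15.0)
print(window_dimensions((0, 0, 8), (3, 0, 8), (0, 0, 3)))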
[0098] Referring to FIGS. 1, 4 and 10, a user may be able to receive a window replacement estimate report 300 using the system 10. For example, the user may request an estimate for window replacement of the structure 102 using one or more user devices 14. In some embodiments, a user may request an estimate for window replacement using one or more input devices 34 of the first host system 12a and/or the second host system 12b. The first host system 12a and/or the second host system 12b may use the methods as detailed in FIG. 4 to create, provide, and/or store the number and/or dimensions of the windows for the structure 102. In some embodiments, a user may be able to manipulate results of the method. For example, the user may be able to use the user device 14 to select the location of the geometric vertices of the windows to verify windows, remove windows, and/or add windows for consideration.
[0099] The first host system 12a and/or the second host system 12b may provide the report 300 to the user for the window estimation. For example, FIG. 10 illustrates an exemplary window replacement estimate report 300. The window replacement estimate report 300 may be distributed using the first host system 12a and/or the second host system 12b to the one or more user devices 14 and/or input devices 34. In some embodiments, the report 300 may be distributed to a third party system in addition to, or in lieu of, the user. For example, if the user is a homeowner, the report 300 may be distributed to the customer and/or to a third party system such as a material supplier, insurance company, real estate agency, home service company, cleaning company, auditing company, contractors, or the like.
[00100] The window replacement estimate report 300 may include data sets such as customer information 302, structure information 304, estimated number and/or area detail 306, and contractor information 308.
[00101] The customer information data set 302 may include the customer name, customer contact information, and/or the like. The structure information data set 304 may include one or more images of the structure 102. The estimated number and/or area detail data set 306 may provide the total estimated number and/or dimensions of one or more windows of the structure as determined using the system 10 as described herein. The contractor data set 308 may include one or more contractor names and/or associated contractor contact information.
[00102] In some embodiments, additional data sets may be included within the window replacement estimate report 300. For example, data sets may include, but are not limited to, weather data, insurance/valuation data, census data, school district data, real estate data, and/or the like as described herein. For example, in some embodiments, the window replacement estimate report 300 may be used in an insurance claim. In the case of property damage or loss of a customer, the first host system 12a and/or the second host system 12b may be used to create, provide, and/or store an insurance dataset with claim information in the window replacement estimate report 300. For example, an insurance database having a policy in force (PIF) and a weather database may be used to correlate information regarding an insurance claim for a particular structure 102 with windows. This information may be provided within the window replacement estimate report 300.
[00103] In some embodiments, system 10 may be used to generate a siding replacement estimate report. The siding replacement estimate report may be similar to the foundation estimate report 200 and the window replacement estimate report 300 described herein. In preparing the siding replacement estimate report, the first host system 12a and/or the second host system 12b may determine the dimensions of siding used on the exterior of the structure 102 including but not limited to dimensions and/or areas of distinct sections of the exterior of the structure 102, and cumulative area of different distinct sections. Referring to FIGS. 1 and 4, in step 108 the first host system 12a and/or the second host system 12b may extrapolate horizontal edges 104, vertical edges 106, and/or sloped edges 108 for each distinct facet of the structure 102. Horizontal edges 104 may be paired, vertical edges 106 may be paired, and/or sloped edges 108 may be paired as in step 110. Pairing of edges may reveal an overview of significant features (e.g., physical attributes), such as the dimensions and/or amount of siding on the exterior of the structure 102. The first host system 12a and/or the second host system 12b may then locate geometric vertices of the horizontal edges 104, vertical edges 106, and/or sloped edges 108 to outline the dimensions of the siding of the structure 102. Dimensions of the siding may be determined using analysis of the dimensions of one or more walls of the structure 102 within the image, and/or within one or more separate images showing different parts of the structure 102. In some embodiments, the area of a wall of the structure 102 may be substantially the same as the area of the siding, while in some exemplary embodiments, the area of the siding may be smaller than the area of the wall of the structure 102.
[00104] In some embodiments, a user may be able to receive a siding replacement estimate report similar to the foundation estimate report 200 and/or the window replacement estimate report 300 of FIGS. 9 and 10. The siding replacement estimate report may include data sets, including, but not limited to, customer information, structure information, estimate details including type, area, and price of siding, contractor information, and/or the like. In some embodiments, additional data sets may be included within the siding replacement estimate report. For example, data sets may include, but are not limited to, weather data, insurance/valuation data, census data, school district data, real estate data, and/or the like as described herein.
[00105] In some embodiments, system 10 may be used to generate a roofing report. The roofing report may be similar to the foundation estimate report 200 and the window replacement estimate report 300 described herein.
[00106] Referring to FIGS. 1 and 4, the first host system 12a and/or the second host system 12b may determine the height of eaves of the structure 102. For example, in step 108 the first host system 12a and/or the second host system 12b may extrapolate horizontal edges 104, vertical edges 106, and/or sloped edges 108 for each facet of a structure 102. The first host system 12a and/or the second host system 12b may then determine horizontal edges 104 located at the peak of the structure 102 using the methods described in FIG. 4, to determine the location and dimensions of the eaves of the structure 102. In some embodiments, eave locations may be determined using the systems and methods as described in U.S. Patent Application Serial No. 12/909,692, which is hereby incorporated by reference in its entirety. The height of the eaves of the structure 102 may then be determined by approximating the distance from a ground level. For example, the height of the eaves of the structure 102 may be determined by using a single image and a ground plane as described in U.S. Patent Nos. 7,424,133 and 8,233,666, which are hereby incorporated by reference in their entirety. Other techniques, such as aerotriangulation using overlapping images, may also be used.
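By way of a non-limiting illustration only, the following sketch approximates eave height as the vertical offset between an eave edge and a ground-plane elevation; the function name and values shown are hypothetical stand-ins for measurements obtained as described above.

# Illustrative sketch only: approximates eave height as the difference between
# the average Z coordinate of the two endpoints of an eave edge and the Z
# coordinate of the ground plane beneath the facet.  All values are
# hypothetical.

def eave_height(eave_edge, ground_z):
    """eave_edge is ((x1, y1, z1), (x2, y2, z2)); returns height above ground_z."""
    average_eave_z = (eave_edge[0][2] + eave_edge[1][2]) / 2.0
    return average_eave_z - ground_z

# Example: eave edge near 412.5 ft elevation over a ground plane at 402.5 ft -> 10.0 ft
print(round(eave_height(((0, 0, 412.4), (30, 0, 412.6)), 402.5), 1))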
[00107] In addition to, or in lieu of the height of the eaves, the pitch of the roof may be included within a report. In some embodiments, the pitch of the roof may be determined using systems and methods as described in U.S. Serial No. 13/424,054, which is hereby incorporated by reference in its entirety.
[00108] In some embodiments, the system 10 may be used to generate a chimney report. The chimney report may be similar to the foundation estimate report 200 and the window replacement estimate report 300 described herein.
[00109] Generally, the first host system 12a and/or the second host system 12b may determine the number and dimensions (e.g., height, width, length, and area) of one or more chimneys of the structure 102. For example, referring to FIGS. 1 and 4, in step 108 the first host system 12a and/or the second host system 12b may extrapolate horizontal edges 104, vertical edges 106, and/or sloped edges 108 for each facet of a structure 102. The first host system 12a and/or the second host system 12b may then determine pairings of vertical edges 106 rising towards and past the peak of the structure 102 using the methods described in FIG. 4, to determine the location and dimensions of the one or more chimneys of the structure 102. The number and dimensions of the one or more chimneys may be provided in a report similar to the foundation estimate report 200 and/or the window replacement report 300. For example, a report may be provided for a customer requesting cleaning of one or more chimneys, demolition of one or more chimneys, and/or the like. Alternatively, a report (including any of the reports described herein) may be provided to a property tax assessor, insurance agency, real estate agency, and/or the like, informing of the number and dimensions of one or more chimneys of the structure 102.
[00110] The methods as described and illustrated in FIG. 4 create, provide and/or store an outline of the structure 102, preferably including one or more physical attributes of the structure 102 (e.g., windows, doors, chimneys, siding, eaves, and/or the like). As such, a sketch of the structure 102 may be provided in some embodiments. The sketch may be used as an image of the structure 102 within many industries including, but not limited to, building contractor companies, real estate agencies, weather agencies, community agencies, home maintenance companies (e.g., gardeners, maid services, window washers, pool maintenance companies, and/or the like), federal agencies, state agencies, municipal agencies, schools, religious agencies, sport and recreation organizations, insurance agencies, historical commissions, utility agencies (e.g., water, gas, electric, sewer, phone, cable, internet, and/or the like), commercial agencies (e.g., grocery stores, big box stores, malls, restaurants, gas/auto service stations, and/or the like), news agencies, travel agencies, mapping agencies, and/or the like.
[00111] In some embodiments, the sketch may include a three-dimensional model. The three-dimensional model may be the basis of a virtual property model containing not only information about the exterior of the structure 102, but also further information including, but not limited to, room layout, living areas, roof layout, and/or the like. Such information may be used in the real estate arena and/or building arena. For example, multiple data sources may be attached to the model to include bids for remodeling and/or the like. Some examples may be found in European Application No. 99123877.5, which is hereby incorporated by reference in its entirety.
[00112] In some embodiments, the sketch, three-dimensional model, and/or additional information may be used to acquire or to apply for any building permits and/or variances (e.g., zoning variances, fence building permits, water well digging permits, septic system construction permits, pool construction permits, etc.), for example. The sketch or model of the structure may be transmitted or otherwise provided to a permit-issuing agency as a part of a permit request (e.g., as a permit request report or request), to allow the permit-issuing agency to process the permit request without having personnel go out to the physical location of the structure 102, or by allowing personnel to defer or delay a site visit, or to avoid a phone call or substantially obviate the need for the permit requesters to submit additional information, for example. In some exemplary embodiments, the permit requester may annotate an image or sketch of the structure to show requested changes or structures to be built or removed, geo-referenced location of visible or underground utilities, compliance with zoning codes, fire codes, geo-referenced property boundaries, and any other relevant annotations to enable the permit-issuing agency to evaluate the permit request without physical visitation to the structure 102, for example. As will be appreciated by persons of ordinary skill in the art, an annotated permit or variance application according to the inventive concepts disclosed herein includes substantially precise geo-referenced locations of proposed features to be added to or removed from the property and/or the structure 102. Such geo-referenced location may show the actual location of the features, thus enabling the permit board, variance board, or zoning board members to consider and decide on the application by virtually visiting the structure 102, and without physically visiting the structure 102 as is the current practice of such boards. In this example, the sketch, three-dimensional model, or the like, can be prepared by a first user of the system and then transmitted to a second user of the system. The first user can be the permit requester, for example, and the second user can be a person(s) working at the permit-issuing agency.
[00113] In some embodiments, for example within the real estate industry, the sketch and/or three-dimensional model of the structure 102 may be used as a virtual tour for prospective buyers. The models may be obtained using the techniques as described herein. Additional information and/or photographs of the interior of the structure 102 may be used to model the interior of the structure 102 and used to create, provide, and/or store a virtual tour of the structure 102.
[00114] The methods, as described and illustrated in FIG. 4, may also apply to areas about the structure 102. For example, property about the structure 102 may be analyzed using the system 10 and techniques described herein. Such property attributes and/or land attributes may include, but are not limited to, size (e.g., acreage) of the property, dimensions of the property, easements and/or rights of way on the property, flood zone of the property, proximity to major roadways, proximity to railways, and/or the like. Alternatively, property or structure 102 attributes may be determined using a third party source and/or stored within the first host system 12a and/or the second host system 12b. Such property attributes may be added into information by the methods described herein. For example, acreage of property associated with the structure 102 may be added into one or more reports as described herein. In another example, the sketch may include locations of the structure 102 in relation to property attributes (e.g., easements, rights of way, dimensions of the property, flood zones, and/or the like).
[00115] Information regarding the structure 102 and/or the property, as described herein, may be used to create, provide, and/or store information and/or reports within the home maintenance arena. For example, the methods and systems as described herein may provide information regarding the dimensions of one or more areas of turf on the property of the structure 102 (e.g., front yard, back yard, side yard, mulch beds, flower beds, gardens, and the like), the dimensions of one or more gardens, the area and/or dimensions of one or more trees, the shade pattern of one or more trees, the actual X, Y, Z location of where a tree trunk intersects the ground, the height of a tree, the shade pattern of a tree (e.g., displayed as darkened lines and/or areas), the dimensions and area of one or more driveways, the number and area of windows, the dimensions of one or more gutters, the dimensions and/or depth of one or more pools (or height of above-ground pools), the dimensions of one or more decks (e.g., whether the decks are single-level, or multi-level, and the height, dimensions, and area of each level, including the height and length of deck railing), and/or the like. Such attributes may be generated by receiving a selection of one or more pixels in a displayed image and transforming the pixel coordinate(s) within the image to real-world geographic coordinates, and then further associating data such as labels or the like to the real world coordinates. The attributes may be added into information or virtual property reports described herein and/or provided in a single report using the methods described herein. For example, a report may be generated by a landscaping agency to include an estimate for lawn or landscaping maintenance. A first user (e.g., homeowner) may request a lawn maintenance estimate using user devices 14, and a second user of the landscaping agency may (1) create, (2) provide, and/or (3) store the measurement data in a database with a geographic indicator, such as an address, and then (4) generate a report using one of the user devices 14 and/or the first host system 12a and/or the second host system 12b as described herein. Such report may be generated without physical visitation of personnel of the landscaping agency to the structure 102. In addition, the measurement data may be addressed and/or accessed by a user using a program, such as a database program or a spreadsheet program. In addition, the measurement data can be saved as a digital worksheet (e.g., Microsoft Excel) file or an XML file, for example.
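By way of a non-limiting illustration only, the following sketch shows how labeled measurement data could be recorded against a geographic indicator such as an address and exported to a file that a spreadsheet program can open; the field names, address, values, and file name are hypothetical, and the georeferencing step described above is assumed to have already produced the measurements.

# Illustrative sketch only: records labeled measurements keyed by a street
# address and writes them out as a simple CSV file that a spreadsheet program
# can open.  All names and values are hypothetical.
import csv
from datetime import date

def record_measurement(records, address, label, value, units):
    """Append one labeled measurement, keyed by address, with today's date."""
    records.append({
        "address": address,
        "label": label,
        "value": value,
        "units": units,
        "date": date.today().isoformat(),
    })

records = []
record_measurement(records, "123 Main St", "front yard turf area", 1850.0, "sq ft")
record_measurement(records, "123 Main St", "driveway area", 640.0, "sq ft")

with open("property_measurements.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["address", "label", "value", "units", "date"])
    writer.writeheader()
    writer.writerows(records)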
[00116] Construction information regarding the structure 102 and/or property may also be included in one or more reports, such as a virtual property report as described herein. For example, construction information may include, but is not limited to, size and slope of the roof, size and slope of the driveway, bearing of the driveway, shade patterns from adjacent structures and/or trees, and/or the like. Such information may be obtained using the methods as described in FIG. 4. For example, shade patterns of structures and/or trees may include lines within an image (e.g., darkened areas). The height of trees and the location where the tree trunk intersects the ground may be determined by a suitable computational measurement technique, or by correlating pixel coordinates to real-world coordinates, and may be included in the report. The footprint of the area may be determined by the darkened areas of the image. As such, the shade pattern may be created, provided, and/or stored using the first host system 12a and/or the second host system 12b as described herein. Shade information, in some embodiments, may include estimations for solar energy orientation and/or potentials. For example, a solar energy estimation report may be created, provided, and/or stored using the footprint of the area of the shade pattern. The report may be similar to the foundation estimate report 200 and/or the window replacement estimate report 300.
[00117] In some embodiments, information obtained by using the methods as described herein may be used by municipalities, users within the municipality, and/or users seeking information regarding structures 102 and/or land within the municipality. For example, using the methods of FIG. 4, boundaries (e.g., footprints) may be determined for counties, schools, cities, parks, churches (location and denominations), trails, public transit lines, and/or the like. Additionally, three-dimensional models and/or sketches may be created, provided, and/or stored for such counties, schools, cities, parks, trails, youth sports facilities, public swimming pools, libraries, hospitals, law enforcement, fire stations, and/or the like. In some embodiments, the models and/or sketches may provide viewings of structures 102 and/or physical attributes of land surrounding the properties. Also, distance between structures 102 may be estimated using techniques as described herein and may be measured along ground planes as described in U.S. Patent Nos. 7,424,133 and 8,233,666, for example. As such, location and/or distances between municipal facilities may be provided to a user.
[00118] In one example, school or school zone boundaries may be determined using modeling of structures (e.g., 2-dimensional and/or 3-dimensional) within the municipality. Modeling of structures may include an estimation of dimensions, number of structures 102, and/or the like. This information may be used to determine feasibility of altering school boundaries, feasibility in adding one or more schools and/or school systems, reporting of existing school boundaries, and/or the like.
[00119] In another example, zoning ordinances may be determined and/or enforced using structure modeling. For example, a homeowner may build a specific structure (e.g., pool), and report or file with the city certain dimensions subject to the zoning ordinances of the municipality. Using the systems and methods as described herein, the municipality may be able to view the pool and measure dimensions of the pool (e.g., size, orientation, or depth for aboveground pools) for compliance without physical visitation to the structure 102.
[00120] In some embodiments, municipal rendering may include mapping for special districts within a municipality. For example, obtaining footprints, models, and/or sketches of structures 102 may aid in grouping of structures 102 for fire and public protection. Using the methods as described herein, measurements may be included for each structure 102 and land surrounding the structures 102 such that distance measurements may be obtained and used to create, provide, and/or store mapping for such special districts (e.g., fire, public protection, congressional districts, electoral districts, and the like). Distance measurements may be determined along ground planes as described in U.S. Patent Nos. 7,424,133 and 8,233,666, for example.
[00121] In one example, an election report may be generated using mapping of the municipality. The election report may include information including, but not limited to, officials within one or more districts (e.g., local, county, state and/or federal officials), demographics of the one or more districts, distance of a point of interest to the one or more districts, location of municipal facilities, polling locations, distance to polling locations from a point of interest, images of polling locations, and/or the like. The election report may be similar to the foundation estimate report 200 and the window replacement report 300.
[00122] In one example, mapping structures 102 and land using the methods described herein may be used to create and/or electronically store in one or more databases 40 one or more models for sewer, lighting, and/or other public works. For example, mapping structures 102 and land may be used to create, provide, and/or store a three-dimensional model with associated dimensions. The three-dimensional model with associated dimensions may be used to plan for additional sewage lines and/or replacement/removal of sewage lines, in one example. With mapping, an estimation regarding feasibility, building materials, and/or the like may be created, provided, and/or stored prior to physical visitation to the area (e.g., structure and/or land).
[00123] Information regarding utility connections and services may also be used to create, provide, and/or store a utility report using the systems and methods as described herein. For example, utility connections may be a physical attribute of a structure 102 identified using methods described herein. As such, the location of visible utility connections may be determined. Additionally, the location of hidden utility services may be identified using structure and land analysis in conjunction with one or more databases having hidden utility location information. In some embodiments, hidden utility services may be determined solely using identification of physical attributes on the land and/or structures. For example, using the system and methods as described herein, physical attributes including, but not limited to, water shut off valves, sewer shut off valves, utility lines, meters, manhole covers, drains, overhanging lines, poles, and/or the like, may be identified and/or measured. From these physical attributes, hidden utility lines may be identified using the image and techniques described herein, including, but not limited to, gas, electric, sewer, water, phone, cable, internet, and/or the like. In some embodiments, additional information about the utility lines may be identified and stored in the database 40 by modeling and/or use of one or more outside databases. For example, the grade, size, capacity, and/or consumption of utility lines may be determined by the systems and methods as described herein, and/or included by one or more databases 40.
[00124] Using the methods as described in FIG. 4, for example, distance between structures 102 and/or land may be determined. Using this information, distance between structures 102 and/or land may be used in applications including, but not limited to, driving and/or commute time between places of interest, distances between and/or to retail environments (e.g., grocery stores, big box stores, malls, restaurants, gas/auto service stations), distance to hospitals, distance and/or time for fire and/or emergency response, distance between schools (e.g., zoning, boundaries), distance between churches (e.g., by denomination), and the like. Such distances may be provided to the user using the user device 14.
[00125] In one example, an emergency service report may be created and provided to a user. The emergency service report may include, but is not limited to, distance from a point of interest to one or more hospitals, distance for fire, ambulance, and/or police response to a point of interest, the type of law enforcement within a geographic area surrounding the point of interest, and/or the like. The emergency service report may be incorporated into any of the previous reports described herein or transmitted or provided as a standalone report. For example, the emergency service report may be similar to the foundation estimate report 200 and the window replacement report 300.
[00126] In one example, distances between churches (e.g., by denomination) may be measured and provided to a user. For example, system 10 may provide a church report including, but not limited to, locations of one or more churches, denominations of the one or more churches, distance from a point of interest to the one or more churches (e.g., user's residence), zonal boundaries of churches (e.g., parish boundaries), photos of the one or more churches, and/or the like. The church report may be similar to the foundation estimate report 200 and the window replacement estimate report 300.
[00127] Similarly, system 10 may provide a school report similar to the church report. The school report may include, but is not limited to, locations of one or more schools, age range of the one or more schools, ratings of the one or more schools, photos of the one or more schools, zonal boundaries of the one or more schools, and/or the like. Similar reports may be provided for parks, sports centers, youth sports practice facilities (e.g., soccer, baseball, football, basketball), youth sports associations and location of facilities, swimming pools (e.g., private and/or public), and/or the like.
[00128] In one embodiment, one or more databases may include information regarding registry of sex offenders, including addresses of sex offenders. This information may be used in conjunction with the distance between structures and/or land to provide distance measurements from a place of interest to one or more addresses of one or more sex offenders. For example, a place of interest may be a proposed or existing school. The distance between the school and the one or more addresses of one or more sex offenders may be provided to a user in a sex offender report. The sex offender report may include, but is not limited to, one or more pictures of one or more sex offenders within a geographic area, addresses of the one or more sex offenders, criminal history of the one or more sex offenders, distance between the place of interest and the one or more sex offenders, sketches of one or more structures inhabited by the one or more sex offenders, and/or the like. In some embodiments, information such as community crime statistics, criminal record checks of neighbors, and/or the like, may be included within the sex offender report, a separate report, a report described herein, and/or any other similar reporting.
[00129] In some embodiments, using the system and methods as described herein, traffic flow may be determined and/or improved. In one example, a road may be mapped and/or modeled using the system and methods described herein. Traffic count may be made either physically on-site and/or using the system and methods as described herein. Using the traffic count and modeling of the road, traffic flow may be determined, altered, and/or improved. For example, for improvement, the model may be used to further include a base for developing additional routes (e.g., roads) in the area to ease traffic congestion. Modeling structures and land surrounding the road may further include additional information regarding development of additional routes.
[00130] In some embodiments, the systems and methods as described herein may be used for a property tax assessment. For example, using the methods as described in FIG. 4, physical attributes of the structure 102, and the structure 102 itself, may be determined, including distances, areas, and dimensions. A sketch and/or model may be determined for the structure 102 as well. The measurements and sketch may be used in a property tax assessment report similar to the foundation estimate report 200 and the window replacement estimate report 300. The property tax assessment report may include the sketch and/or model of the structure 102 and/or property, physical attributes of the structure 102 and/or property (e.g., grade, condition, dimensions, and/or the like), details of the assessed value of the structure 102 and/or property, current and/or historic values of the property (e.g., based on taxes, market value, and/or the like), additional imagery included by the tax assessor, a tax map, and/or other similar features.
[00131] In some embodiments, the system and methods as described herein may be used in formulation of an insurance report. The insurance report may be similar to the foundation estimate report 200 and the window replacement estimate report 300. The insurance report may include risks of weather hazards, maps of weather hazard risks, insurance ratings of structures and/or land, prior claim history for structures 102 and/or land, information from the emergency service report and/or the utility report discussed above, and/or the like. Using the methods as described in FIG. 4, distances of hazards and/or to hazards may be determined in some embodiments. Additionally, the square footage of the entire structure 102 and/or portions of the structure 102 may be determined. Based on the square footage, replacement cost estimates may be generated for replacement by total square footage or living area and included within the insurance report, for example.
[00132] In some embodiments, using the methods as described in FIG. 4, for example, the program logic 42 may include instructions to determine and report the orientation of a facet of the structure 102 relative to a street, by determining which facet of the structure 102 faces the street and/or is the front of the structure 102. For example, determining which facet faces the street may be carried out by determining a center of the structure 102, projecting a line connecting the center of the structure to a known location of the street address of the structure, and designating a facet positioned between the center and the street with which the line intersects as the street-facing facet. As described above, the program logic may determine the X, Y, and Z location of each corner of the structure 102. The center of the structure 102 may be determined by averaging the X coordinates of each corner and the Y coordinates of each corner of the structure 102. The known location on the street where the street address of the structure 102 is located may be obtained from a separate database, such as the T.I.G.E.R. (Topologically Integrated Geographic Encoding and Referencing) database, for example.
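By way of a non-limiting illustration only, the following sketch shows one way to designate the street-facing facet by averaging the footprint corners to obtain a center point, drawing a line from that center to a known street-address location (e.g., taken from a street file), and selecting the facet that the line crosses; the coordinates and function names are hypothetical.

# Illustrative sketch only: designates the street-facing facet of a footprint
# polygon as the edge crossed by the line from the polygon's averaged center
# to a known street-address point.  All coordinates are hypothetical.

def _segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 properly intersects segment p3-p4."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def street_facing_facet(corners, street_point):
    """Return the index of the facet (edge) crossed by the center-to-street line."""
    cx = sum(x for x, _ in corners) / len(corners)
    cy = sum(y for _, y in corners) / len(corners)
    for i in range(len(corners)):
        a, b = corners[i], corners[(i + 1) % len(corners)]
        if _segments_intersect((cx, cy), street_point, a, b):
            return i
    return None

# Example: square footprint with the street-address point due south -> facet 0
corners = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(street_facing_facet(corners, (5, -20)))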
[00133] In an exemplary embodiment, determining which facet of the structure 102 faces the street may be carried out by selecting a facet that is substantially parallel or angled at a predetermined angle relative to the street. The predetermined angle may be substantially perpendicular to the street, or any other desired angle. An automated or semi-automated algorithm may be used to determine the front of the structure, and a user or an administrator may review the determination of the algorithm and accept or edit the determination, or direct the algorithm to start over, for example. Information identifying the front of the structure 102 can be used for a variety of different transformative or analysis applications. For example, the image of the determined front facet of the structure 102 may be retrieved and displayed as the initial or default view of a virtual property report according to the inventive concepts disclosed herein, including the location of the main or front entrance to the structure, for example. Further, a street access to the structure 102 may be identified for law enforcement or emergency services accessing the database 40, to include a main entrance for personnel, a back or side entrance, a driveway, alleyway, or street access for vehicles, and combinations thereof, for example. The front of the structure 102 and/or information about vehicle or personnel access to the property and/or structure 102 may be stored in the database 40, and may be provided as part of a virtual property report, for example. Similarly, the back of the structure 102 may be determined to be the facet that is offset by about 180° from the determined front of the structure 102 as will be appreciated by persons of ordinary skill in the art.
[00134] The determined front and back facets of the structure 102 may be included in a variety of virtual property reports according to the inventive concepts disclosed herein, such as an emergency response report showing the street access for vehicles to the structure 102 and/or the property surrounding the structure (e.g., where the street address location of the structure 102 does not correspond to the location of a driveway or entryway into the structure 102), the location of a main personnel entrance to the structure 102, the location of a secondary or back entrance to the structure 102, the location of nearby fire hydrants, overhanging power or utility lines, etc., to enable emergency personnel to determine the best way to access the structure and respond to an emergency, such as medical emergencies, police emergencies, fires, floods, earthquakes, gas leaks, tornados, and other emergency situations, for example.
[00135] Using the methods as described in FIG. 4, for example, the program logic 42 may also include instructions to determine and report changes to structures 102 and/or land. The program logic 42 may include instructions that, when executed by the first and/or second host systems 12a and 12b, cause the first and/or second host systems 12a and 12b to store the following information in the database(s) 40: the location of one or more structures 102 (such as by address, or latitude and longitude, for example), measurements of the structures 102 and/or the land, identities of the measurements (such as height to eave, area of roof, length, width, or area of front flower bed, etc.), as well as the date on which the measurement(s) are taken. This can be accomplished by the database 40 having one or more tables with predetermined fields (e.g., area of roof field, height to eave field, length of front flower bed field, length of rear flower bed field, number of windows field, etc.) to store the measurements. For example, the structure 102 can be detected and/or measured on a first date, e.g., January 1, 2011, and data indicative of the measurement type and date stored in the database 40. The same structure 102 can be detected and/or measured on a second date, e.g., January 1, 2012, and the measurements, measurement type, and date stored. The system may detect differences in the size, location, dimensions, or appearance of any objects or structures between the two times, and combinations thereof, for example. The first and/or the second host systems 12a and 12b can be programmed to periodically, randomly, and/or with user activation, check for any differences in the measurements of the structure 102. If differences in measurements above a predefined amount (e.g., 0%, 5% or the like) are determined, a change report identifying the structure(s) 102 and the changes can be prepared and output by the first and/or second host systems 12a and 12b to one or more of the user devices 14 as a web page, form and/or a computer file.
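By way of a non-limiting illustration only, the following sketch compares measurements of the same structure from two dates and flags any field whose relative change exceeds a threshold, as could feed a change report; the field names, values, and the 5% threshold are hypothetical.

# Illustrative sketch only: flags fields whose relative change between two
# measurement dates exceeds a threshold.  Field names, values, and the
# threshold are hypothetical.

def detect_changes(old, new, threshold=0.05):
    """Return a list of (field, old_value, new_value, relative_change) tuples."""
    changes = []
    for field, old_value in old.items():
        new_value = new.get(field)
        if new_value is None or old_value == 0:
            continue  # skip missing fields and avoid division by zero
        relative = abs(new_value - old_value) / abs(old_value)
        if relative > threshold:
            changes.append((field, old_value, new_value, relative))
    return changes

measured_2011 = {"area of roof": 2400.0, "height to eave": 10.0}
measured_2012 = {"area of roof": 2680.0, "height to eave": 10.0}
# Only the roof area changed by more than 5%, so only that field is reported
print(detect_changes(measured_2011, measured_2012))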
[00136] The system 10 may be used by humans operating the user devices 14 to view and measure an object, such as the structure 102 and/or the land, in an oblique and/or ortho-image using image display and analysis software, as described for example in U.S. Patent No. 7,424,133; and U.S. Patent Application No. 2012-0101783. The program logic 42 can include instructions for storing the structure measurement information into the database(s) 40 with or without the users knowing that the measurements are being stored. The structure measurement information may include the following: the location of one or more structures 102 (such as by address, or latitude and longitude, for example), measurements of the structures 102 and/or the land, measurement type (such as height to eave, area of roof, length of front flower bed, etc.), as well as the date on which the measurement(s) are taken. This can be accomplished by providing a web page to the user's user device 14 requesting the user to assign a measurement type to the measurement. For example, a field on a web page or form can be provided with a predetermined list of measurement types such as area of roof, height to eave, length of lot, or the like, and a button on the web page or form can be used by the user to indicate that a measurement type has been selected. Alternatively, the image and the measurement can be automatically provided by the first and/or the second host systems 12a and 12b to another user device 14 to be viewed by another user, who assigns and stores a measurement type to the measurement using the user device 14. The first and/or the second host systems 12a and 12b receive and store the measurement and measurement type in the database(s) 40. Automated or semi-automated algorithms may be used to make any of the determinations and/or measurements described herein, and a user or an administrator may review the determinations and/or measurements of the algorithms and accept or edit the measurement or determination, or direct the algorithm to start over, for example.
[00137] It should be noted that the inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. For example, several embodiments include one or more data sets. Each data set may be included solely in that report, or be included in one or more other reports as described herein. For example, the change report can be combined with the roof report or the assessment report to provide the user with information regarding the current and historical measurements of the structure 102. Additionally, data sets within each report may stand alone. For example, the real estate data set may be a stand-alone report not associated with other reports as described herein. It should also be noted that although "reports" are generated herein, such information need not be distributed in the form of a report. For example, the system 10 may determine there are 3 chimneys and distribute this information solely without inclusion within a "report."
[00138] From the above description, it is clear that the inventive concepts disclosed and claimed herein are well adapted to carry out the objects and to attain the advantages mentioned herein, as well as those inherent therein. While exemplary embodiments of the inventive concepts have been described for purposes of this disclosure, it will be understood that numerous changes may be made which will readily suggest themselves to those skilled in the art and which are accomplished within the broad scope and spirit of the inventive concepts disclosed herein and/or as defined in the appended claims.

Claims

1. A computer system comprising at least one processor capable of executing processor executable code operably coupled with a non-transitory computer medium storing processor executable code, which when executed by the at least one processor causes the at least one processor to:
receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; in response to receiving the first signal, access a database including information about the target structure; and
transmit a second signal over the computer network, the second signal indicative of a virtual property report for the target structure including at least one image of the target structure.
2. The computer system of claim 1, wherein the processor executable code when executed by the at least one processor causes the processor to determine, using aerial imagery and at least one data set indicative of street files, information about one or more of: a facet of the target structure facing a street, location of a main entrance of the target structure relative to the street, location of a secondary entrance of the target structure relative to the street, location of vehicle access to the target structure within the at least one image, and store the information in the database and associated with the target structure.
3. The computer system of claim 2, wherein the processor executable code for determining the information is organized to be executed by the at least one processor prior to executing the instructions to receive the first signal.
4. The computer system of claim 2, wherein the processor executable code when executed by the at least one processor causes the processor to access the information indicative of the facet of the target structure facing the street, and retrieve and display an image of the facet within the virtual property report.
5. The computer system of claim 1, wherein prior to the step of receiving the first signal over the computer network, the processor executable code when executed by the at least one processor causes the at least one processor to receive a selection of one or more pixels within a displayed image of the target structure in which the one or more pixels have pixel coordinates, transform the pixel coordinates into real-world geographic coordinates, measure distances between the real-world coordinates, and store the measurements on a non-transitory computer readable medium within the database and associated with the target structure.
6. The computer system of claim 5, wherein the processor executable code when executed by the at least one processor causes the at least one processor to associate a label identifying the measurement with a particular measurement, and store the label with the measurement within the database and associated with the target structure.
7. The computer system of claim 6, wherein the label identifying the measurement is a particular field within the database.
8. The computer system of claim 6, wherein the label is selected from the group including orientation and area of a driveway depicted within the displayed image, size or area of a deck depicted within the displayed image, location and height of trees adjacent to the target structure, areas of windows of the target structure, area of a vertical or pitched surface on the target structure, a height of an eave of the target structure, a height of a chimney of the target structure, a distance to a church from the target structure.
9. The computer system of claim 1, wherein the processor executable code when executed by the at least one processor causes the at least one processor to analyze a parcel database to determine an identity of neighboring parcels either adjacent to the target structure or within a predefined radius of the target structure, and to make available to a user information related to neighboring parcels when accessing information indicative of the target structure.
10. The computer system of claim 9, wherein the processor executable code when executed by the at least one processor causes the at least one processor to identify ownership or residency of neighboring parcels and to store a link within the database to information indicative of the owners or residents of neighboring parcels.
11. The computer system of claim 9, wherein the processor executable code when executed by the at least one processor causes the at least one processor to identify social media information of owners or residents of neighboring parcels and to store information within the database indicative of the social media information of the owners or residents of the neighboring parcels.
12. A non-transitory computer readable medium storing computer executable instructions that when executed by one or more processors cause the one or more processors to determine data indicative of one or more predetermined features of a target structure displayed in an image, the one or more predetermined features selected from: a facet of the target structure facing a street, location of a main entrance of the target structure relative to the street, location of a secondary entrance of the target structure relative to the street, orientation and area of a driveway of the target structure, location of vehicle access to the target structure, size and area of a deck of the target structure, real-world geographic location and height of a tree depicted in the image with the target structure, a geographic location of a trunk of a tree depicted in the image with the target structure, an area of a window of the target structure, an area of siding of the target structure depicted within multiple aerial images, a height of an eave of the target structure, height of a chimney of the target structure, a distance to one or more churches from the target structure, and social media information of a neighbor of the target structure.
13. A computer system comprising at least one processor capable of executing processor executable code operably coupled with a non-transitory computer medium storing processor executable code, which when executed by the at least one processor causes the at least one processor to:
receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; in response to receiving the first signal, access a database including multiple aerial images of the target structure; automatically identify an aerial image from the multiple aerial images depicting a facet of the target structure that faces a street; and
transmit a second signal over the computer network, the signal indicative of the aerial image of the target structure depicting the facet of the target structure that faces the street.
14. The computer system of claim 13, wherein the processor executable code causes the processor to transmit a sequence of third signals depicting the multiple aerial images following the transmission of the second signal.
15. The computer system of claim 13, wherein the processor executable code causes the at least one processor to identify the facet of the target structure that faces the street by:
accessing a file identifying the street;
projecting a line through at least a portion of a facet of the target structure to the street; and
identifying the facet that the line is projected through as the facet of the target structure that faces the street.
16. The computer system of claim 13, wherein the processor executable code further causes the at least one processor to identify a rear facet of the target structure as a facet that is positioned at about 180° relative to the facet of the target structure that faces the street.
17. A computer system comprising at least one processor capable of executing processor executable code operably coupled with a non-transitory computer medium storing processor executable code, which when executed by the at least one processor causes the at least one processor to:
receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; in response to receiving the first signal, access a database including information about the target structure; and transmit a second signal over the computer network, the second signal indicative of a virtual property report for the target structure including at least one image of the target structure.
18. The computer system of claim 17, wherein the processor executable code when executed by the at least one processor causes the processor to determine, using aerial imagery and at least one data set indicative of street files, information about one or more of: a facet of the target structure facing a street, location of a main entrance of the target structure relative to the street, location of a secondary entrance of the target structure relative to the street, location of vehicle access to the target structure within the at least one image, and store the information in the database and associated with the target structure.
19. The computer system of claim 18, wherein the processor executable code for determining the information is organized to be executed by the at least one processor prior to executing the instructions to receive the first signal.
20. The computer system of claim 18 or 19, wherein the processor executable code when executed by the at least one processor causes the processor to access the information indicative of the facet of the target structure facing the street, and retrieve and display an image of the facet within the virtual property report.
21. The computer system of any one of claims 17 to 20, wherein prior to the step of receiving the first signal over the computer network, the processor executable code when executed by the at least one processor causes the at least one processor to receive a selection of one or more pixels within a displayed image of the target structure in which the one or more pixels have pixel coordinates, transform the pixel coordinates into real-world geographic coordinates, measure distances between the real-world coordinates, and store the measurements on a non-transitory computer readable medium within the database and associated with the target structure.
22. The computer system of claim 21, wherein the processor executable code when executed by the at least one processor causes the at least one processor to associate a label identifying the measurement with a particular measurement, and store the label with the measurement within the database and associated with the target structure.
23. The computer system of claim 22, wherein the label identifying the measurement is a particular field within the database.
24. The computer system of claim 22 or 23, wherein the label is selected from the group including orientation and area of a driveway depicted within the displayed image, size or area of a deck depicted within the displayed image, location and height of trees adjacent to the target structure, areas of windows of the target structure, area of a vertical or pitched surface on the target structure, a height of an eave of the target structure, a height of a chimney of the target structure, a distance to a church from the target structure.
25. The computer system of any one of claims 17 to 24, wherein the processor executable code when executed by the at least one processor causes the at least one processor to analyze a parcel database to determine an identity of neighboring parcels either adjacent to the target structure or within a predefined radius of the target structure, and to make available to a user information related to neighboring parcels when accessing information indicative of the target structure.
26. The computer system of claim 25, wherein the processor executable code when executed by the at least one processor causes the at least one processor to identify ownership or residency of neighboring parcels and to store a link within the database to information indicative of the owners or residents of neighboring parcels.
27. The computer system of claim 25 or 26, wherein the processor executable code when executed by the at least one processor causes the at least one processor to identify social media information of owners or residents of neighboring parcels and to store information within the database indicative of the social media information of the owners or residents of the neighboring parcels.
28. A non-transitory computer readable medium storing computer executable instructions that when executed by one or more processors cause the one or more processors to determine data indicative of one or more predetermined features of a target structure displayed in an image, the one or more predetermined features selected from: a facet of the target structure facing a street, location of a main entrance of the target structure relative to the street, location of a secondary entrance of the target structure relative to the street, orientation and area of a driveway of the target structure, location of vehicle access to the target structure, size and area of a deck of the target structure, real-world geographic location and height of a tree depicted in the image with the target structure, a geographic location of a trunk of a tree depicted in the image with the target structure, an area of a window of the target structure, an area of siding of the target structure depicted within multiple aerial images, a height of an eave of the target structure, height of a chimney of the target structure, a distance to one or more churches from the target structure, and social media information of a neighbor of the target structure.
29. A computer system comprising at least one processor capable of executing processor executable code operably coupled with a non-transitory computer medium storing processor executable code, which when executed by the at least one processor causes the at least one processor to:
receive a first signal over a computer network, the first signal indicative of a request for information about a target structure from a user; in response to receiving the first signal, access a database including multiple aerial images of the target structure;
automatically identify an aerial image from the multiple aerial images depicting a facet of the target structure that faces a street; and
transmit a second signal over the computer network, the signal indicative of the aerial image of the target structure depicting the facet of the target structure that faces the street.
30. The computer system of claim 29, wherein the processor executable code causes the processor to transmit a sequence of third signals depicting the multiple aerial images following the transmission of the second signal.
31. The computer system of claim 29 or 30, wherein the processor executable code causes the at least one processor to identify the facet of the target structure that faces the street by:
accessing a file identifying the street;
projecting a line through at least a portion of a facet of the target structure to the street; and
identifying the facet that the line is projected through as the facet of the target structure that faces the street.
32. The computer system of any one of claims 29, 30, or 31, wherein the processor executable code further causes the at least one processor to identify a rear facet of the target structure as a facet that is positioned at about 180° relative to the facet of the target structure that faces the street.
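For illustration of the facet-identification approach recited in claims 15, 16, and 31, the following sketch projects a line from a building footprint toward a point on a street and returns the footprint edge (facet) that the line crosses. The footprint coordinates, the street point, and the helper names are assumptions made for the example only; they are not the claimed implementation.

```python
# Illustrative sketch only -- not part of the patent disclosure.
from typing import List, Tuple

Point = Tuple[float, float]

def _ccw(a: Point, b: Point, c: Point) -> bool:
    """True if the points a, b, c are arranged counterclockwise."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1: Point, p2: Point, p3: Point, p4: Point) -> bool:
    """True if segment p1-p2 properly crosses segment p3-p4."""
    return (_ccw(p1, p3, p4) != _ccw(p2, p3, p4)) and (_ccw(p1, p2, p3) != _ccw(p1, p2, p4))

def street_facing_facet(footprint: List[Point], street_point: Point) -> Tuple[Point, Point]:
    """Return the footprint edge crossed by a line from the footprint centroid to the street."""
    cx = sum(p[0] for p in footprint) / len(footprint)
    cy = sum(p[1] for p in footprint) / len(footprint)
    centroid = (cx, cy)
    for a, b in zip(footprint, footprint[1:] + footprint[:1]):
        if segments_intersect(centroid, street_point, a, b):
            return (a, b)
    raise ValueError("no facet between the footprint centroid and the street point")

# Example: a square footprint with the street to the south; the southern edge is returned.
# The rear facet (claims 16 and 32) would then be the edge oriented roughly 180 degrees away.
footprint = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(street_facing_facet(footprint, (5.0, -20.0)))   # -> ((0.0, 0.0), (10.0, 0.0))
```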
PCT/US2014/024567 2013-03-15 2014-03-12 Virtual property reporting for automatic structure detection WO2014150927A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP14771085.9A EP2972953A4 (en) 2013-03-15 2014-03-12 Virtual property reporting for automatic structure detection
CA2906448A CA2906448C (en) 2013-03-15 2014-03-12 Virtual property reporting for automatic structure detection
MX2015012469A MX355657B (en) 2013-03-15 2014-03-12 Virtual property reporting for automatic structure detection.
AU2014235464A AU2014235464B2 (en) 2013-03-15 2014-03-12 Virtual property reporting for automatic structure detection

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/840,258 2013-03-15
US13/840,258 US9753950B2 (en) 2013-03-15 2013-03-15 Virtual property reporting for automatic structure detection

Publications (1)

Publication Number Publication Date
WO2014150927A1 true WO2014150927A1 (en) 2014-09-25

Family

ID=51533228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/024567 WO2014150927A1 (en) 2013-03-15 2014-03-12 Virtual property reporting for automatic structure detection

Country Status (6)

Country Link
US (1) US9753950B2 (en)
EP (1) EP2972953A4 (en)
AU (1) AU2014235464B2 (en)
CA (1) CA2906448C (en)
MX (1) MX355657B (en)
WO (1) WO2014150927A1 (en)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140365709A1 (en) * 2013-06-10 2014-12-11 Jason Matthew Strauss Electronic computer program product and an electronic computer system for producing a location report
US8819038B1 (en) * 2013-10-06 2014-08-26 Yahoo! Inc. System and method for performing set operations with defined sketch accuracy distribution
US10055506B2 (en) 2014-03-18 2018-08-21 Excalibur Ip, Llc System and method for enhanced accuracy cardinality estimation
US20150310557A1 (en) * 2014-04-25 2015-10-29 State Farm Mutual Automobile Insurance Company System and Method for Intelligent Aerial Image Data Processing
US10019761B2 (en) 2014-04-25 2018-07-10 State Farm Mutual Automobile Insurance Company System and method for virtual inspection of a structure
US10755357B1 (en) * 2015-07-17 2020-08-25 State Farm Mutual Automobile Insurance Company Aerial imaging for insurance purposes
US10311302B2 (en) 2015-08-31 2019-06-04 Cape Analytics, Inc. Systems and methods for analyzing remote sensing imagery
US20170177748A1 (en) * 2015-12-16 2017-06-22 Wal-Mart Stores, Inc. Residential Upgrade Design Tool
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods
US11153310B2 (en) * 2016-04-21 2021-10-19 Signify Holding B.V. Systems and methods for registering and localizing building servers for cloud-based monitoring and control of physical environments
WO2019014406A1 (en) * 2017-07-12 2019-01-17 Zodiac Pool Systems Llc Systems and methods for mapping or otherwise discerning characteristics of swimming pools and spas
US10984231B1 (en) * 2017-11-10 2021-04-20 ThoughtTrace, Inc. Rights mapping system and method
US11023985B1 (en) * 2017-11-16 2021-06-01 State Farm Mutual Automobile Insurance Company Systems and methods for executing a customized home search
US11151669B1 (en) 2017-11-16 2021-10-19 State Farm Mutual Automobile Insurance Company Systems and methods for identifying hidden home maintenance costs
US10825241B2 (en) * 2018-03-16 2020-11-03 Microsoft Technology Licensing, Llc Using a one-dimensional ray sensor to map an environment
US10664673B2 (en) 2018-03-29 2020-05-26 Midlab, Inc. Training system for use with janitorial and cleaning products
US10990777B1 (en) * 2018-03-29 2021-04-27 Midlab, Inc. Method for interactive training in the cleaning of a room
US11704731B2 (en) * 2018-04-11 2023-07-18 Hartford Fire Insurance Company Processing system to generate risk scores for electronic records
US10830473B2 (en) 2018-05-14 2020-11-10 Johnson Controls Technology Company Systems and methods for zoning system setup
US20190354537A1 (en) * 2018-05-17 2019-11-21 James A. La Maire System and Method for Increasing the Speed at Which Critical Safety Information is Provided or Accessed
US11810202B1 (en) 2018-10-17 2023-11-07 State Farm Mutual Automobile Insurance Company Method and system for identifying conditions of features represented in a virtual model
US11024099B1 (en) 2018-10-17 2021-06-01 State Farm Mutual Automobile Insurance Company Method and system for curating a virtual model for feature identification
US11556995B1 (en) * 2018-10-17 2023-01-17 State Farm Mutual Automobile Insurance Company Predictive analytics for assessing property using external data
US10873724B1 (en) 2019-01-08 2020-12-22 State Farm Mutual Automobile Insurance Company Virtual environment generation for collaborative building assessment
US11107292B1 (en) * 2019-04-03 2021-08-31 State Farm Mutual Automobile Insurance Company Adjustable virtual scenario-based training environment
US11049072B1 (en) 2019-04-26 2021-06-29 State Farm Mutual Automobile Insurance Company Asynchronous virtual collaboration environments
US11032328B1 (en) 2019-04-29 2021-06-08 State Farm Mutual Automobile Insurance Company Asymmetric collaborative virtual environments
US11847937B1 (en) * 2019-04-30 2023-12-19 State Farm Mutual Automobile Insurance Company Virtual multi-property training environment
US11734469B2 (en) * 2019-10-24 2023-08-22 Home Outside, Inc System and method for generating a landscape design
KR20220139431A (en) * 2020-03-02 2022-10-14 구글 엘엘씨 Topology base model supporting improved merging and stable feature identity
WO2021207733A1 (en) 2020-04-10 2021-10-14 Cape Analytics, Inc. System and method for geocoding
US11288533B2 (en) * 2020-05-27 2022-03-29 Verizon Patent And Licensing Inc. Systems and methods for identifying a service qualification of a unit of a community
CA3180114C (en) 2020-06-02 2023-08-29 Fabian RICHTER Method for property feature segmentation
WO2022082007A1 (en) 2020-10-15 2022-04-21 Cape Analytics, Inc. Method and system for automated debris detection
US11854257B2 (en) * 2021-05-18 2023-12-26 Here Global B.V. Identifying canopies
WO2023283231A1 (en) 2021-07-06 2023-01-12 Cape Analytics, Inc. System and method for property condition analysis
US11861843B2 (en) 2022-01-19 2024-01-02 Cape Analytics, Inc. System and method for object analysis

Family Cites Families (176)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2273876A (en) 1940-02-12 1942-02-24 Frederick W Lutz Apparatus for indicating tilt of cameras
US3153784A (en) 1959-12-24 1964-10-20 Us Industries Inc Photo radar ground contour mapping system
US5345086A (en) 1962-11-28 1994-09-06 Eaton Corporation Automatic map compilation system
US3621326A (en) 1968-09-30 1971-11-16 Itek Corp Transformation system
US3594556A (en) 1969-01-08 1971-07-20 Us Navy Optical sight with electronic image stabilization
US3661061A (en) 1969-05-05 1972-05-09 Atomic Energy Commission Picture position finder
US3614410A (en) 1969-06-12 1971-10-19 Knight V Bailey Image rectifier
US3716669A (en) 1971-05-14 1973-02-13 Japan Eng Dev Co Mapping rectifier for generating polarstereographic maps from satellite scan signals
US3725563A (en) 1971-12-23 1973-04-03 Singer Co Method of perspective transformation in scanned raster visual display
US3864513A (en) 1972-09-11 1975-02-04 Grumman Aerospace Corp Computerized polarimetric terrain mapping system
US4015080A (en) 1973-04-30 1977-03-29 Elliott Brothers (London) Limited Display devices
JPS5223975Y2 (en) 1973-05-29 1977-05-31
US3877799A (en) 1974-02-06 1975-04-15 United Kingdom Government Method of recording the first frame in a time index system
DE2510044A1 (en) 1975-03-07 1976-09-16 Siemens Ag ARRANGEMENT FOR RECORDING CHARACTERS USING MOSAIC PENCILS
US4707698A (en) 1976-03-04 1987-11-17 Constant James N Coordinate measurement and radar device using image scanner
US4240108A (en) 1977-10-03 1980-12-16 Grumman Aerospace Corporation Vehicle controlled raster display system
JPS5637416Y2 (en) 1977-10-14 1981-09-02
IT1095061B (en) 1978-05-19 1985-08-10 Conte Raffaele EQUIPMENT FOR MAGNETIC REGISTRATION OF CASUAL EVENTS RELATED TO MOBILE VEHICLES
US4396942A (en) 1979-04-19 1983-08-02 Jackson Gates Video surveys
FR2461305B1 (en) 1979-07-06 1985-12-06 Thomson Csf MAP INDICATOR SYSTEM MORE PARTICULARLY FOR AIR NAVIGATION
DE2939681A1 (en) 1979-09-29 1981-04-30 Agfa-Gevaert Ag, 5090 Leverkusen METHOD AND DEVICE FOR MONITORING THE QUALITY IN THE PRODUCTION OF PHOTOGRAPHIC IMAGES
DE2940871C2 (en) 1979-10-09 1983-11-10 Messerschmitt-Bölkow-Blohm GmbH, 8012 Ottobrunn Photogrammetric method for aircraft and spacecraft for digital terrain display
US4387056A (en) 1981-04-16 1983-06-07 E. I. Du Pont De Nemours And Company Process for separating zero-valent nickel species from divalent nickel species
US4382678A (en) 1981-06-29 1983-05-10 The United States Of America As Represented By The Secretary Of The Army Measuring of feature for photo interpretation
US4463380A (en) 1981-09-25 1984-07-31 Vought Corporation Image processing system
US4495500A (en) 1982-01-26 1985-01-22 Sri International Topographic data gathering method
US4490742A (en) 1982-04-23 1984-12-25 Vcs, Incorporated Encoding apparatus for a closed circuit television system
US4586138A (en) 1982-07-29 1986-04-29 The United States Of America As Represented By The United States Department Of Energy Route profile analysis system and method
US4491399A (en) 1982-09-27 1985-01-01 Coherent Communications, Inc. Method and apparatus for recording a digital signal on motion picture film
US4527055A (en) 1982-11-15 1985-07-02 Honeywell Inc. Apparatus for selectively viewing either of two scenes of interest
FR2536851B1 (en) 1982-11-30 1985-06-14 Aerospatiale RECOGNITION SYSTEM COMPRISING AN AIR VEHICLE TURNING AROUND ITS LONGITUDINAL AXIS
US4489322A (en) 1983-01-27 1984-12-18 The United States Of America As Represented By The Secretary Of The Air Force Radar calibration using direct measurement equipment and oblique photometry
US4635136A (en) 1984-02-06 1987-01-06 Rochester Institute Of Technology Method and apparatus for storing a massive inventory of labeled images
US4686474A (en) 1984-04-05 1987-08-11 Deseret Research, Inc. Survey system for collection and real time processing of geophysical data
US4814711A (en) 1984-04-05 1989-03-21 Deseret Research, Inc. Survey system and method for real time collection and processing of geophysicals data using signals from a global positioning satellite network
US4673988A (en) 1985-04-22 1987-06-16 E.I. Du Pont De Nemours And Company Electronic mosaic imaging process
US4653136A (en) 1985-06-21 1987-03-31 Denison James W Wiper for rear view mirror
EP0211623A3 (en) 1985-08-01 1988-09-21 British Aerospace Public Limited Company Identification of ground targets in airborne surveillance radar returns
US4953227A (en) 1986-01-31 1990-08-28 Canon Kabushiki Kaisha Image mosaic-processing method and apparatus
US4653316A (en) 1986-03-14 1987-03-31 Kabushiki Kaisha Komatsu Seisakusho Apparatus mounted on vehicles for detecting road surface conditions
US4688092A (en) 1986-05-06 1987-08-18 Ford Aerospace & Communications Corporation Satellite camera image navigation
US4956872A (en) 1986-10-31 1990-09-11 Canon Kabushiki Kaisha Image processing apparatus capable of random mosaic and/or oil-painting-like processing
JPS63202182A (en) 1987-02-18 1988-08-22 Olympus Optical Co Ltd Tilted dot pattern forming method
US4814896A (en) 1987-03-06 1989-03-21 Heitzman Edward F Real time video data acquisition systems
US5164825A (en) 1987-03-30 1992-11-17 Canon Kabushiki Kaisha Image processing method and apparatus for mosaic or similar processing therefor
US4807024A (en) 1987-06-08 1989-02-21 The University Of South Carolina Three-dimensional display methods and apparatus
US4899296A (en) 1987-11-13 1990-02-06 Khattak Anwar S Pavement distress survey system
US4843463A (en) 1988-05-23 1989-06-27 Michetti Joseph A Land vehicle mounted audio-visual trip recorder
GB8826550D0 (en) 1988-11-14 1989-05-17 Smiths Industries Plc Image processing apparatus and methods
US4906198A (en) 1988-12-12 1990-03-06 International Business Machines Corporation Circuit board assembly and contact pin for use therein
JP2765022B2 (en) 1989-03-24 1998-06-11 キヤノン販売株式会社 3D image forming device
US5617224A (en) 1989-05-08 1997-04-01 Canon Kabushiki Kaisha Image processing apparatus having mosaic processing feature that decreases image resolution without changing image size or the number of pixels
US5086314A (en) 1990-05-21 1992-02-04 Nikon Corporation Exposure control apparatus for camera
JPH0316377A (en) 1989-06-14 1991-01-24 Kokusai Denshin Denwa Co Ltd <Kdd> Method and apparatus for reducing binary picture
US5166789A (en) 1989-08-25 1992-11-24 Space Island Products & Services, Inc. Geographical surveying using cameras in combination with flight computers to obtain images with overlaid geographical coordinates
JP3147358B2 (en) 1990-02-23 2001-03-19 ミノルタ株式会社 Camera that can record location data
US5335072A (en) 1990-05-30 1994-08-02 Minolta Camera Kabushiki Kaisha Photographic system capable of storing information on photographed image data
EP0464263A3 (en) 1990-06-27 1992-06-10 Siemens Aktiengesellschaft Device for obstacle detection for pilots of low flying aircrafts
US5191174A (en) 1990-08-01 1993-03-02 International Business Machines Corporation High density circuit board and method of making same
US5200793A (en) 1990-10-24 1993-04-06 Kaman Aerospace Corporation Range finding array camera
US5155597A (en) 1990-11-28 1992-10-13 Recon/Optical, Inc. Electro-optical imaging array with motion compensation
JPH04250436A (en) 1991-01-11 1992-09-07 Pioneer Electron Corp Image pickup device
US5265173A (en) 1991-03-20 1993-11-23 Hughes Aircraft Company Rectilinear object image matcher
US5369443A (en) 1991-04-12 1994-11-29 Abekas Video Systems, Inc. Digital video effects generator
CA2066280C (en) 1991-04-16 1997-12-09 Masaru Hiramatsu Image pickup system with a image pickup device for control
US5555018A (en) 1991-04-25 1996-09-10 Von Braun; Heiko S. Large-scale mapping of parameters of multi-dimensional structures in natural environments
US5231435A (en) 1991-07-12 1993-07-27 Blakely Bruce W Aerial camera mounting apparatus
EP0530391B1 (en) 1991-09-05 1996-12-11 Nec Corporation Image pickup system capable of producing correct image signals of an object zone
US5677515A (en) 1991-10-18 1997-10-14 Trw Inc. Shielded multilayer printed wiring board, high frequency, high isolation
US5402170A (en) 1991-12-11 1995-03-28 Eastman Kodak Company Hand-manipulated electronic camera tethered to a personal computer
US5247356A (en) 1992-02-14 1993-09-21 Ciampa John A Method and apparatus for mapping and measuring land
US5270756A (en) 1992-02-18 1993-12-14 Hughes Training, Inc. Method and apparatus for generating high resolution vidicon camera images
US5251037A (en) 1992-02-18 1993-10-05 Hughes Training, Inc. Method and apparatus for generating high resolution CCD camera images
US5506644A (en) 1992-08-18 1996-04-09 Olympus Optical Co., Ltd. Camera
US5481479A (en) 1992-12-10 1996-01-02 Loral Fairchild Corp. Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance
US5342999A (en) 1992-12-21 1994-08-30 Motorola, Inc. Apparatus for adapting semiconductor die pads and method therefor
US5414462A (en) 1993-02-11 1995-05-09 Veatch; John W. Method and apparatus for generating a comprehensive survey map
US5508736A (en) 1993-05-14 1996-04-16 Cooper; Roger D. Video signal processing apparatus for producing a composite signal for simultaneous display of data and video information
US5467271A (en) 1993-12-17 1995-11-14 Trw, Inc. Mapping and analysis system for precision farming applications
DE69532126T2 (en) 1994-05-19 2004-07-22 Geospan Corp., Plymouth METHOD FOR COLLECTING AND PROCESSING VISUAL AND SPATIAL POSITION INFORMATION
RU2153700C2 (en) 1995-04-17 2000-07-27 Спейс Системз/Лорал, Инк. Orientation and image shaping control system (design versions)
US5604534A (en) 1995-05-24 1997-02-18 Omni Solutions International, Ltd. Direct digital airborne panoramic camera system and method
US5668593A (en) 1995-06-07 1997-09-16 Recon/Optical, Inc. Method and camera system for step frame reconnaissance with motion compensation
US5963664A (en) 1995-06-22 1999-10-05 Sarnoff Corporation Method and system for image combination using a parallax-based technique
US5794216A (en) 1995-07-14 1998-08-11 Brown; Timothy Robert Methods and system for data acquisition in a multimedia real estate database
US5835133A (en) 1996-01-23 1998-11-10 Silicon Graphics, Inc. Optical system for single camera stereo video
US5852810A (en) 1996-01-29 1998-12-22 Student Housing Network Geographic specific information search system and method
US5894323A (en) 1996-03-22 1999-04-13 Tasc, Inc, Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data
US5844602A (en) 1996-05-07 1998-12-01 Recon/Optical, Inc. Electro-optical imaging array and camera system with pitch rate image motion compensation which can be used in an airplane in a dive bomb maneuver
US5798786A (en) 1996-05-07 1998-08-25 Recon/Optical, Inc. Electro-optical imaging detector array for a moving vehicle which includes two axis image motion compensation and transfers pixels in row directions and column directions
US5841574A (en) 1996-06-28 1998-11-24 Recon/Optical, Inc. Multi-special decentered catadioptric optical system
US6108032A (en) 1996-11-05 2000-08-22 Lockheed Martin Fairchild Systems System and method for image motion compensation of a CCD image sensor
EP0937230B1 (en) 1996-11-05 2003-04-09 BAE SYSTEMS Information and Electronic Systems Integration Inc. Electro-optical reconnaissance system with forward motion compensation
RU2127075C1 (en) 1996-12-11 1999-03-10 Корженевский Александр Владимирович Method for producing tomographic image of body and electrical-impedance tomographic scanner
US6222583B1 (en) 1997-03-27 2001-04-24 Nippon Telegraph And Telephone Corporation Device and system for labeling sight images
US6597818B2 (en) 1997-05-09 2003-07-22 Sarnoff Corporation Method and apparatus for performing geo-spatial registration of imagery
US6097854A (en) 1997-08-01 2000-08-01 Microsoft Corporation Image mosaic construction system and apparatus with patch-based alignment, global block adjustment and pair-wise motion-based local warping
US6157747A (en) 1997-08-01 2000-12-05 Microsoft Corporation 3-dimensional image rotation method and apparatus for producing image mosaics
AU9783798A (en) 1997-10-06 1999-04-27 John A. Ciampa Digital-image mapping
US5852753A (en) 1997-11-10 1998-12-22 Lo; Allen Kwok Wah Dual-lens camera with shutters for taking dual or single images
WO1999024936A1 (en) 1997-11-10 1999-05-20 Gentech Corporation System and method for generating super-resolution-enhanced mosaic images
US6037945A (en) 1997-12-16 2000-03-14 Xactware, Inc. Graphical method for modeling and estimating construction costs
US6094215A (en) 1998-01-06 2000-07-25 Intel Corporation Method of determining relative camera orientation position to create 3-D visual images
US6130705A (en) 1998-07-10 2000-10-10 Recon/Optical, Inc. Autonomous electro-optical framing camera system with constant ground resolution, unmanned airborne vehicle therefor, and methods of use
JP4245699B2 (en) 1998-09-16 2009-03-25 オリンパス株式会社 Imaging device
US6434265B1 (en) 1998-09-25 2002-08-13 Apple Computers, Inc. Aligning rectilinear images in 3D through projective registration and calibration
DE19857667A1 (en) 1998-12-15 2000-08-17 Aerowest Photogrammetrie H Ben Process for creating a three-dimensional object description
US6167300A (en) 1999-03-08 2000-12-26 Tci Incorporated Electric mammograph
DE19922341C2 (en) 1999-05-14 2002-08-29 Zsp Geodaetische Sys Gmbh Method and arrangement for determining the spatial coordinates of at least one object point
AUPQ056099A0 (en) 1999-05-25 1999-06-17 Silverbrook Research Pty Ltd A method and apparatus (pprint01)
JP5210473B2 (en) 1999-06-21 2013-06-12 株式会社半導体エネルギー研究所 Display device
US6639596B1 (en) 1999-09-20 2003-10-28 Microsoft Corporation Stereo reconstruction from multiperspective panoramas
WO2001048683A1 (en) 1999-12-29 2001-07-05 Geospan Corporation Any aspect passive volumetric image processing method
US6829584B2 (en) 1999-12-31 2004-12-07 Xactware, Inc. Virtual home data repository and directory
US6826539B2 (en) 1999-12-31 2004-11-30 Xactware, Inc. Virtual structure data repository and directory
US6810383B1 (en) 2000-01-21 2004-10-26 Xactware, Inc. Automated task management and evaluation
AU3047801A (en) 2000-02-03 2001-08-14 Alst Technical Excellence Center Image resolution improvement using a color mosaic sensor
AU2001271238A1 (en) 2000-03-16 2001-09-24 The Johns-Hopkins University Light detection and ranging (lidar) mapping system
IL151951A0 (en) 2000-03-29 2003-04-10 Astrovision International Inc Direct broadcast imaging satellite system, apparatus and method for providing real-time, continuous monitoring of earth from geostationary earth orbit and related services
US7038681B2 (en) 2000-03-29 2006-05-02 Sourceprose Corporation System and method for georeferencing maps
US7184072B1 (en) 2000-06-15 2007-02-27 Power View Company, L.L.C. Airborne inventory and inspection system and apparatus
US6834128B1 (en) 2000-06-16 2004-12-21 Hewlett-Packard Development Company, L.P. Image mosaicing system and method adapted to mass-market hand-held digital cameras
US6484101B1 (en) 2000-08-16 2002-11-19 Imagelinks, Inc. 3-dimensional interactive image modeling system
US7313289B2 (en) 2000-08-30 2007-12-25 Ricoh Company, Ltd. Image processing method and apparatus and computer-readable storage medium using improved distortion correction
US6421610B1 (en) 2000-09-15 2002-07-16 Ernest A. Carroll Method of preparing and disseminating digitized geospatial data
US20090132316A1 (en) * 2000-10-23 2009-05-21 Costar Group, Inc. System and method for associating aerial images, map features, and information
US6959120B1 (en) 2000-10-27 2005-10-25 Microsoft Corporation Rebinning methods and arrangements for use in compressing image-based rendering (IBR) data
AU2002308651A1 (en) 2001-05-04 2002-11-18 Leberl, Franz, W. Digital camera for and method of obtaining overlapping images
US7046401B2 (en) 2001-06-01 2006-05-16 Hewlett-Packard Development Company, L.P. Camera-based document scanning system using multiple-pass mosaicking
US7509241B2 (en) 2001-07-06 2009-03-24 Sarnoff Corporation Method and apparatus for automatically generating a site model
US20030043824A1 (en) 2001-08-31 2003-03-06 Remboski Donald J. Vehicle active network and device
US6747686B1 (en) 2001-10-05 2004-06-08 Recon/Optical, Inc. High aspect stereoscopic mode camera and method
US7262790B2 (en) 2002-01-09 2007-08-28 Charles Adams Bakewell Mobile enforcement platform with aimable violation identification and documentation system for multiple traffic violation types across all lanes in moving traffic, generating composite display images and data to support citation generation, homeland security, and monitoring
US7092957B2 (en) * 2002-01-18 2006-08-15 Boundary Solutions Incorporated Computerized national online parcel-level map data portal
TW550521B (en) 2002-02-07 2003-09-01 Univ Nat Central Method for re-building 3D model of house in a semi-automatic manner using edge segments of buildings
JP4184703B2 (en) 2002-04-24 2008-11-19 大日本印刷株式会社 Image correction method and system
US7127348B2 (en) 2002-09-20 2006-10-24 M7 Visual Intelligence, Lp Vehicle based data collection and processing system
US7725258B2 (en) 2002-09-20 2010-05-25 M7 Visual Intelligence, L.P. Vehicle based data collection and processing system and imaging sensor system and methods thereof
US7424133B2 (en) 2002-11-08 2008-09-09 Pictometry International Corporation Method and apparatus for capturing, geolocating and measuring oblique images
EP1696204B1 (en) 2002-11-08 2015-01-28 Pictometry International Corp. Method for capturing, geolocating and measuring oblique images
US7636901B2 (en) * 2003-06-27 2009-12-22 Cds Business Mapping, Llc System for increasing accuracy of geocode data
US7018050B2 (en) 2003-09-08 2006-03-28 Hewlett-Packard Development Company, L.P. System and method for correcting luminance non-uniformity of obliquely projected images
JP2005151536A (en) 2003-10-23 2005-06-09 Nippon Dempa Kogyo Co Ltd Crystal oscillator
US7916940B2 (en) 2004-01-31 2011-03-29 Hewlett-Packard Development Company Processing of mosaic digital images
WO2005088251A1 (en) 2004-02-27 2005-09-22 Intergraph Software Technologies Company Forming a single image from overlapping images
US20050273346A1 (en) * 2004-06-02 2005-12-08 Frost Richard N Real property information management system and method
US9213461B2 (en) 2004-06-16 2015-12-15 Redfin Corporation Web-based real estate mapping system
US20060028550A1 (en) 2004-08-06 2006-02-09 Palmer Robert G Jr Surveillance system and method
US7728833B2 (en) 2004-08-18 2010-06-01 Sarnoff Corporation Method for generating a three-dimensional model of a roof structure
US8207964B1 (en) * 2008-02-22 2012-06-26 Meadow William D Methods and apparatus for generating three-dimensional image data models
US8078396B2 (en) 2004-08-31 2011-12-13 Meadow William D Methods for and apparatus for generating a continuum of three dimensional image data
US7348895B2 (en) 2004-11-03 2008-03-25 Lagassey Paul J Advanced automobile accident detection, data recordation and reporting system
US7142984B2 (en) 2005-02-08 2006-11-28 Harris Corporation Method and apparatus for enhancing a digital elevation model (DEM) for topographical modeling
US7466244B2 (en) 2005-04-21 2008-12-16 Microsoft Corporation Virtual earth rooftop overlay and bounding
US7554539B2 (en) 2005-07-27 2009-06-30 Balfour Technologies Llc System for viewing a collection of oblique imagery in a three or four dimensional virtual scene
US20070218900A1 (en) * 2006-03-17 2007-09-20 Raj Vasant Abhyanker Map based neighborhood search and community contribution
US7844499B2 (en) 2005-12-23 2010-11-30 Sharp Electronics Corporation Integrated solar agent business model
US7778491B2 (en) 2006-04-10 2010-08-17 Microsoft Corporation Oblique image stitching
US8538676B2 (en) * 2006-06-30 2013-09-17 IPointer, Inc. Mobile geographic information system and method
US7873238B2 (en) 2006-08-30 2011-01-18 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US20080077458A1 (en) 2006-09-19 2008-03-27 Andersen Timothy J Collecting and representing home attributes
DE102007030781A1 (en) 2006-10-11 2008-04-17 Gta Geoinformatik Gmbh Method for texturing virtual three-dimensional objects
IL179344A (en) 2006-11-16 2014-02-27 Rafael Advanced Defense Sys Method for tracking a moving platform
US7832267B2 (en) 2007-04-25 2010-11-16 Ecometriks, Llc Method for determining temporal solar irradiance values
WO2009025928A2 (en) 2007-06-19 2009-02-26 Ch2M Hill, Inc. Systems and methods for solar mapping, determining a usable area for solar energy production and/or providing solar information
US8417061B2 (en) 2008-02-01 2013-04-09 Sungevity Inc. Methods and systems for provisioning energy systems
US8275194B2 (en) 2008-02-15 2012-09-25 Microsoft Corporation Site modeling using image data fusion
CN101978395B (en) 2008-04-23 2012-10-03 株式会社博思科 Building roof outline recognizing device, and building roof outline recognizing method
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US9330494B2 (en) * 2009-10-26 2016-05-03 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US8977520B2 (en) 2010-10-21 2015-03-10 Pictometry International Corp. Computer system for automatically classifying roof elements
US20130036031A1 (en) * 2011-08-02 2013-02-07 South Central Planning and Development Commission System for monitoring land use activities
US8774525B2 (en) * 2012-02-03 2014-07-08 Eagle View Technologies, Inc. Systems and methods for estimation of building floor area
US10684753B2 (en) * 2012-03-28 2020-06-16 The Travelers Indemnity Company Systems and methods for geospatial value subject analysis and management
US20130262152A1 (en) * 2012-03-28 2013-10-03 The Travelers Indemnity Company Systems and methods for certified location data collection, management, and utilization
US9959581B2 (en) * 2013-03-15 2018-05-01 Eagle View Technologies, Inc. Property management on a smartphone

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004111973A1 (en) * 2003-06-12 2004-12-23 Denso Corporation Image server, image collection device, and image display terminal
JP2006119797A (en) * 2004-10-20 2006-05-11 Sony Ericsson Mobilecommunications Japan Inc Information providing system and mobile terminal
WO2012112009A2 (en) * 2011-02-18 2012-08-23 Samsung Electronics Co., Ltd. Method and mobile apparatus for displaying an augmented reality
JP2012209833A (en) * 2011-03-30 2012-10-25 Panasonic Corp Facility information display device and facility information display system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2972953A4 *

Also Published As

Publication number Publication date
MX2015012469A (en) 2016-02-05
CA2906448A1 (en) 2014-09-25
MX355657B (en) 2018-04-26
EP2972953A4 (en) 2016-11-02
AU2014235464A1 (en) 2015-09-24
US20140280269A1 (en) 2014-09-18
CA2906448C (en) 2021-05-25
AU2014235464B2 (en) 2019-11-07
US9753950B2 (en) 2017-09-05
EP2972953A1 (en) 2016-01-20

Similar Documents

Publication Publication Date Title
AU2014235464B2 (en) Virtual property reporting for automatic structure detection
AU2018206829B2 (en) Method and system for quick square roof reporting
US11620714B2 (en) Systems and methods for estimation of building floor area
US10528960B2 (en) Aerial roof estimation system and method
US8977520B2 (en) Computer system for automatically classifying roof elements
CA2901448C (en) Systems and methods for performing a risk management assessment of a property
Ning et al. Exploring the vertical dimension of street view image based on deep learning: a case study on lowest floor elevation estimation
US20230419430A1 (en) Systems, methods and apparatus for property defect management
US20180357720A1 (en) Detection of Real Estate Development Construction Activity
Griffith-Charles et al. Capturing Legal and Physical Boundary Differences in 3D Space–A Case Study of Trinidad and Tobago
KR101640444B1 (en) GIS based Surpportive Toolkits for Merging the Lots
Lewis et al. Coefficients for Estimating Landscape Area on Single‐Family Residential Lots
Oye Mass retrofitting of an energy efficient low carbon zone
Noh Residential housing markets before and after land use changes
Rahman Developing a semi/automated protocol to post-process large volume, High-resolution airborne thermal infrared (TIR) imagery for urban waste heat mapping

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14771085

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2014771085

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: MX/A/2015/012469

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2906448

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2014235464

Country of ref document: AU

Date of ref document: 20140312

Kind code of ref document: A