(Part II) The increased affordability, and perceived ease of use, of technologies developed for engineering and remote sensing have led to their widespread adoption in archaeology over the past five years. In particular, low-cost 3D image-based modelling solutions like Agisoft PhotoScan have made photogrammetry and “Structure from Motion” (SfM) buzzwords of the moment. The ability of such software to ingest relatively unstructured images, which lack any deliberate camera network design other than high overlap, and yet produce visually pleasing meshed surfaces is an astonishing application of computer-vision stereo-matching algorithms originally intended for basic scene reconstruction. Indeed, some software packages or online services purport to offer high-accuracy finished meshes without any user intervention whatsoever. But as the noted Australian photogrammetrist Clive Fraser argued in an editorial in the Photogrammetric Record (30(149), March 2015, 3-7), no purely SfM-based software package has made it to the commercial market, nor is there any package that can produce results of the highest orders of accuracy without significant user intervention. To achieve accuracy and repeatability, the non-linear “thick lens” camera model used by photogrammetric engineers has had to be integrated. His editorial is at once an informative overview of a field that has witnessed continuous development, from the early efforts of Albrecht Maydenbauer in the late 19th century to conduct photographic surveys of built heritage to millimetric levels of accuracy, and a warning: without a firm grasp of the underlying principles of photogrammetric engineering, the enthusiastic adoption of computer-vision techniques can undermine decades of effort to establish standards of photogrammetric accuracy. 
The rapid adoption of SfM recording methods, largely, it seems, because of their ease of use, has led archaeologists into dangerous territory, especially where these new technologies have wholly supplanted time-tested manual methods. Without the quality control and quality assurance (QA/QC) processes that are routinely followed in industry with these same technologies, the reliability, repeatability and long-term value of such 3D data products may be in jeopardy. This session will consider how established QA/QC regimes can rectify this situation. Topics may include robust methods of camera calibration, the use of ground-control points and check-points to validate model accuracy, the archiving of photogrammetric datasets, accurate drawing from 3D data, appropriate file formats, the history of photogrammetry in archaeology, and discussion of what constitutes a final photogrammetric product in archaeology. The session will challenge participants and audience to look beyond mere “visualization” as the exclusive function of SfM and image-based modelling, towards reliable methods that can match or exceed the quality and repeatability of traditional methods of documentation. The session also welcomes contributions from users of laser scanning. The advent of more affordable systems like the FARO Focus3D has led to increased adoption of laser scanning technologies in research and teaching environments, in part because such systems fit the costing and financial structures of institutions at their current price point. Not every laser scanner, however, is the same, and there are parallels with the case of photogrammetry and computer vision. For example, there has never been an established standard for adequately comparing laser systems between manufacturers. The immediate point of reference is the manufacturer's specifications, which are determined using methods that differ from manufacturer to manufacturer. 
It has therefore always fallen to end users, research institutions like i3Mainz, and laboratories such as the Seibersdorf Laboratory in Austria to generate comparison reports. The laser scanning part of the session welcomes case studies on the differences in angular (arcsecond) accuracy and precision between laser scanners; experiences in using E57 as a universal file format versus LAS; and best practice in target-based versus targetless workflows.
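As an illustration of the check-point validation discussed above, the sketch below compares surveyed check-point coordinates against their positions as measured in an SfM-derived model, reporting per-axis and combined 3D root-mean-square error (RMSE). All point names and coordinates are invented for illustration; a real workflow would draw them from a total-station survey and the reconstructed model.

```python
# Hypothetical sketch: validating a photogrammetric model against
# independent check-points via per-axis and 3D RMSE.
# All coordinates below are illustrative, not real survey data.
import math

# Surveyed (reference) check-point coordinates in metres: name -> (X, Y, Z)
surveyed = {
    "CP1": (100.000, 200.000, 50.000),
    "CP2": (105.250, 198.400, 50.120),
    "CP3": (98.760, 203.310, 49.870),
}

# The same points as measured in the SfM-derived model
modelled = {
    "CP1": (100.004, 199.997, 50.006),
    "CP2": (105.243, 198.405, 50.111),
    "CP3": (98.765, 203.304, 49.878),
}

def rmse_per_axis(ref, est):
    """Return (rmse_x, rmse_y, rmse_z, rmse_3d) over shared point names."""
    names = sorted(set(ref) & set(est))
    n = len(names)
    sq = [0.0, 0.0, 0.0]  # accumulated squared residuals per axis
    for name in names:
        for i in range(3):
            d = est[name][i] - ref[name][i]
            sq[i] += d * d
    rx, ry, rz = (math.sqrt(s / n) for s in sq)
    r3d = math.sqrt(sum(sq) / n)  # combined 3D RMSE
    return rx, ry, rz, r3d

rx, ry, rz, r3d = rmse_per_axis(surveyed, modelled)
print(f"RMSE X={rx*1000:.1f} mm  Y={ry*1000:.1f} mm  "
      f"Z={rz*1000:.1f} mm  3D={r3d*1000:.1f} mm")
```

The key QA point is that check-points are withheld from the model's georeferencing (unlike ground-control points), so the RMSE is an independent estimate of accuracy that can be reported and repeated.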