2014 Commits

Author SHA1 Message Date
David Williams
64c30044b0 Moving some macros. 2015-05-08 14:50:10 +02:00
David Williams
f16a247934 Changed implementation of logging macros.
We have observed some strange performance-related behavior as described here: https://stackoverflow.com/questions/29652787/adding-stringstream-cout-hurts-performance-even-when-the-code-is-never-called

This set of changes addresses this problem. The old macros would simply expand the logging code in place, whereas we now have logging functions and the macros call these. Overall I believe it is tidier.
2015-05-07 22:58:00 +02:00
David Williams
4dadbbffd1 Added comment about performance for the future. 2015-04-26 09:25:57 +02:00
David Williams
1d925a59a1 Fixed crash. 2015-04-16 16:47:12 +02:00
David Williams
9947425169 Fix for code which determines which old chunk to delete. 2015-04-15 16:58:24 +02:00
David Williams
cd752b4459 Merge branch 'feature/custom-chunk-hash-table' into develop 2015-04-14 23:43:41 +02:00
David Williams
6ff7b46e26 Merge branch 'develop' into feature/custom-chunk-hash-table 2015-04-14 23:30:29 +02:00
Matt Williams
32c30471a6 Make m_uChunkSideLengthMinusOne const 2015-04-14 15:01:10 +01:00
David Williams
12fdeb8e52 Removed old chunk map.
Removed flush(Region) function as it's a bit trickier to implement with the new hash table, and it's not clear that we need it.
2015-04-13 23:51:18 +02:00
David Williams
1e0e8a8c16 Fixed calculation of volume size in bytes. 2015-04-13 23:48:33 +02:00
David Williams
f7c1962773 Removed commented-out code. 2015-04-13 23:32:23 +02:00
David Williams
143c9fd08d Made test 10x longer. 2015-04-13 21:34:59 +02:00
David Williams
37c35a08db Added code to ensure the number of chunks doesn't go over our target limit. 2015-04-13 21:30:59 +02:00
David Williams
8757f1e53e Removed unneeded assert. 2015-04-13 21:17:19 +02:00
David Williams
5dd46c4bcf Merge branch 'develop' into feature/custom-chunk-hash-table 2015-04-13 21:07:48 +02:00
David Williams
64be18cd14 Tidied up loop for inserting chunk into array. 2015-04-12 20:55:49 +02:00
David Williams
af70096fcc Tidying and adding comments. 2015-04-12 16:46:43 +02:00
David Williams
99390580dd Replaced number with constant. 2015-04-12 10:35:12 +02:00
David Williams
c4cccf9043 Replaced double for loop with cleaner do-while loop. 2015-04-12 09:55:30 +02:00
David Williams
f35581506c Minor optimization - only creating vector if we are going to use it. 2015-04-12 09:42:15 +02:00
David Williams
54903150e9 Merge branch 'develop' into feature/custom-chunk-hash-table 2015-04-12 09:19:14 +02:00
David Williams
c562341db0 Added a second PagedVolume to the tests with much higher allowed memory usage. This makes more sense when testing random access, as low permitted memory usage causes disk IO to become the bottleneck. 2015-04-10 16:56:19 +02:00
David Williams
b90f0d4e15 Made the FilePager a little more robust regarding filename conflicts. 2015-04-10 16:47:50 +02:00
David Williams
8bd013f28e Added RawVolume version of test as well. 2015-04-10 16:14:29 +02:00
David Williams
887ecc1aaa Adding test to measure voxel access times when sampling the volume randomly. 2015-04-10 16:09:35 +02:00
David Williams
a2fe1944af Initial work on replacing std::unordered_map with a specialized hash table for looking up chunks based on their 3D position. 2015-04-09 23:44:25 +02:00
David Williams
27a59f34bc Merge branch 'feature/morton-encoding' into develop 2015-04-05 17:44:27 +02:00
David Williams
4c24d61408 Added another function for backwards compatibility. 2015-04-05 12:03:12 +02:00
David Williams
c887d1444f Added utility function for people who already have data in linear order, to convert it to Morton order. 2015-04-05 10:14:25 +02:00
David Williams
d521b08cf9 Added comment. 2015-04-04 09:57:31 +02:00
David Williams
dec06bcfe4 Added caching of variable. 2015-04-04 09:49:12 +02:00
David Williams
77db90ac30 Removed unneeded variable. 2015-04-04 09:42:46 +02:00
David Williams
0d36c416f2 Tidied up macros. 2015-04-04 09:18:51 +02:00
David Williams
3ca0222b19 Applied simplified test when going in the negative direction as well. 2015-04-04 00:08:20 +02:00
David Williams
d1bcaec2c5 This commit knocks about 30% off the run time of the sampler tests by using a more efficient check for whether we are near the edge of the chunk. 2015-04-02 23:11:19 +02:00
David Williams
d41a7d2747 Removed redundant samplers. 2015-04-02 21:35:50 +02:00
David Williams
135aa96bdf Further fixes for move...() functions. 2015-04-01 23:34:57 +02:00
David Williams
056cae39b5 Fixed sampler move...() functions to work with Morton ordering. 2015-04-01 22:57:22 +02:00
David Williams
b518978cd6 Enlarged lookup tables to 256 elements. 2015-04-01 22:34:42 +02:00
David Williams
65f39e7b57 Made the values signed ints, as otherwise the casting was doing something strange on 64-bit systems. 2015-04-01 16:29:19 +02:00
David Williams
5d220c5d57 Added extra lookup tables to avoid the need to multiply y/z deltas by 2/4 each time. 2015-03-31 23:58:01 +02:00
David Williams
60612c5583 Implemented use of delta for the rest of the peek functions. 2015-03-31 19:55:22 +02:00
David Williams
afd0650230 Implemented peeking in positive x and negative x directions using Matt's delta lookup table. 2015-03-31 16:33:56 +02:00
David Williams
120b8e84cc Added position in chunk and pointer to current chunk data to sampler. 2015-03-30 23:33:51 +02:00
David Williams
d34c1d227c Merge branch 'develop' into feature/morton-encoding 2015-03-30 15:38:34 +02:00
David Williams
5847219331 Fixed bug with chunk timestamp not being updated. 2015-03-30 15:36:28 +02:00
David Williams
b415e5c5f3 calculateAmbientOcclusion() now works with both RawVolume and PagedVolume. 2015-03-30 11:44:25 +02:00
David Williams
d000616d3e Revert "Ambient occlusion test now uses RawVolume, as it needs a fixed size volume to create a temporary array."
This reverts commit 396d1cfc599e6837cf38bc1a95e680e9721ea844.
2015-03-30 11:24:48 +02:00
David Williams
413bb95b1a Passing parameter as const ref. 2015-03-30 11:01:08 +02:00
David Williams
7f96005985 Commented out optimized path in sampler as it doesn't work now that we are using Morton ordering for the data in chunks. However, we can probably reinstate such a fast path if we give some thought as to how it should be done. 2015-03-29 09:58:28 +02:00