Wenbin Lu | a8a40d2afd | 2024-10-22 16:47:14 +02:00
feature: support SVM heap in reserveVirtualMem
Related-To: NEO-11981
Signed-off-by: Wenbin Lu <wenbin.lu@intel.com>

Compute-Runtime-Validation | 41df1a6f47 | 2024-10-03 14:53:50 +02:00
Revert "feature: support SVM heap in reserveVirtualMem"
This reverts commit bfaeeb01d6.
Signed-off-by: Compute-Runtime-Validation <compute-runtime-validation@intel.com>

Wenbin Lu | bfaeeb01d6 | 2024-09-09 23:22:04 +02:00
feature: support SVM heap in reserveVirtualMem
Related-To: NEO-11981
Signed-off-by: Wenbin Lu <wenbin.lu@intel.com>

Jaroslaw Chodor | 996dd18b76 | 2021-09-28 18:09:46 +02:00
WSL - partition layout for 48-bit limited addressing
Signed-off-by: Jaroslaw Chodor <jaroslaw.chodor@intel.com>

Igor Venevtsev | 0bc5e158e5 | 2020-05-28 11:39:27 +02:00
Pass preferred base address to OSMemory::reserveCpuAddressRange()
Related-To: NEO-4525
Change-Id: I6d97ae41af1a0fba31993683bfc669f79aa5b77b
Signed-off-by: Igor Venevtsev <igor.venevtsev@intel.com>

Igor Venevtsev | 2ac968e6c2 | 2020-04-01 07:46:40 +02:00
Add alignment capability do OSMemory::reserveCpuAddressRange
Resolves: NEO-4510
Change-Id: Iffcb33d1c06ca930345df0216bc5d3d1ce12c313
Signed-off-by: Igor Venevtsev <igor.venevtsev@intel.com>

Mateusz Jablonski | 7df9945ebe | 2020-02-23 23:49:12 +01:00
Add absolute include paths
Change-Id: I67a6919bbbff1d30c7d6cdb257b41c87bad51e7f
Signed-off-by: Mateusz Jablonski <mateusz.jablonski@intel.com>

kamdiedrich | e072275ae6 | 2020-02-23 23:48:28 +01:00
Reorganization directory structure [3/n]
Change-Id: If3dfa3f6007f8810a6a1ae1a4f0c7da38544648d