Commit 7e3ad167 by Alexis Hétu

Increased maximum token size

dEQP tests require token lengths up to 1024. Fixes dEQP-GLES3.functional.shaders.uniform_block.valid.long_*

Change-Id: Iba2a79fc210e58e5681dd15a3cece3f8129d4d32
Reviewed-on: https://swiftshader-review.googlesource.com/14229
Tested-by: Alexis Hétu <sugoi@google.com>
Reviewed-by: Nicolas Capens <nicolascapens@google.com>
parent 3eb573f0
@@ -2367,7 +2367,7 @@ namespace pp {
 // TODO(alokp): Maximum token length should ideally be specified by
 // the preprocessor client, i.e., the compiler.
-const size_t Tokenizer::kMaxTokenLength = 256;
+const size_t Tokenizer::kMaxTokenLength = 1024;
 Tokenizer::Tokenizer(Diagnostics* diagnostics) : mHandle(0)
 {
@@ -284,7 +284,7 @@ namespace pp {
 // TODO(alokp): Maximum token length should ideally be specified by
 // the preprocessor client, i.e., the compiler.
-const size_t Tokenizer::kMaxTokenLength = 256;
+const size_t Tokenizer::kMaxTokenLength = 1024;
 Tokenizer::Tokenizer(Diagnostics* diagnostics) : mHandle(0)
 {