Schema acting strangely. Are there restrictions on schema size or something else I'm missing?

In this schema I have roughly 350 columns. It’s starting to act strangely.
Syntax errors are being reported on columns that don't use any kind of formula.
Syntax errors are occurring on columns, saying that a specific object is not present when it clearly is.
Newly created columns are not maintaining changes made to title and properties.
Saving takes quite some time.
Previewing takes quite some time.
Attempting to download even 1 row can frequently cause a timeout.

Thoughts?

More info.

I tried exporting the JSON and breaking it down into sections. For example, I have a bunch of values within area ‘i’, then area ‘ii’, then area ‘iii’, and so forth (the areas are inferred from the fact that the column names begin with ‘i’, ‘ii’, or ‘iii’).
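
For anyone wanting to reproduce that split, here is a small sketch in Ruby (Mockaroo formulas are Ruby, so staying in that language). It assumes the exported schema is a JSON array of field objects each with a "name" key; the real export layout may differ, so treat the structure as an assumption:

```ruby
require 'json'

# Assumption: the exported schema is a JSON array of field objects,
# each carrying a "name" key. Group fields by their leading run of
# 'i' characters ("i", "ii", "iii", ...); anything else goes to "other".
def split_by_prefix(fields)
  fields.group_by do |f|
    f['name'][/\A(i+)_/, 1] || 'other'
  end
end

fields   = JSON.parse('[{"name":"i_01"},{"name":"ii_28"},{"name":"iii_06"}]')
sections = split_by_prefix(fields)
# Each section can then be written back out as its own schema JSON.
```

Each resulting group can be saved and imported as its own smaller schema for testing.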

When I make a schema with the contents of section ‘i’ and preview the data, I get no problems.


Same with ‘ii’…

And ‘iii’ works fine too…

When I combine the information even just for sections ‘i’ through ‘iii’, I start getting odd behavior:

Examples:
For column ‘ii_28’ I get the following: error: Field ‘iii_06’ not found
Column ‘ii_28’ does not reference ‘iii_06’, not to mention that ‘iii_06’ is in fact there.

Column ‘iii_21’ uses a regex to get results from column ‘__iii_21b’, which does reference ‘iii_06’ (which, again, is actually there), and the same error occurs: error: Field ‘iii_06’ not found

Column ‘iii_29’ reports an error on its formula: this if iii_27 else ‘N/A’ end
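
As an aside, that last error may be genuine: Mockaroo formulas are evaluated as Ruby, and Ruby's `value if condition` modifier form takes no `else` clause, so that expression fails to parse regardless of schema size. A quick sketch in plain Ruby (with `this` and `iii_27` standing in as ordinary variables, since the real column values only exist inside Mockaroo):

```ruby
this   = 'some value'  # stand-in for the column's generated value
iii_27 = true          # stand-in for the referenced column

# `this if iii_27 else 'N/A' end` is a SyntaxError in Ruby: the `if`
# modifier has no `else`. Two forms that do parse:
result = iii_27 ? this : 'N/A'
result = if iii_27 then this else 'N/A' end
```

If the column really needs a fallback, rewriting it as the ternary form should clear that particular error.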

Again, these schemas run on their own (linked above) do not experience these errors. If I export the JSON data, combine it, and then import the combined set, having changed NOTHING, I get these errors.

Any help is appreciated.

I have had this same issue as well with a schema that has 369 fields, and it's causing a lot of trouble. I have 18 sets of records that should either be blank or filled out, depending on whether the record before them was filled out. When I get to record 18, Mockaroo reports an error that record17.lastName doesn't exist.

It's as if I need to create hidden fields at the beginning of the file for whether each record# is completed or blanked out, so that they can impact fields towards the bottom of the file. Are there any hard limits on the number of fields within a schema, or on the organization (i.e. nesting) of those data elements?
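
The cascading fill-or-blank logic described above can be sketched in plain Ruby. Everything here is hypothetical: the record names, the hidden `completed` flag, and the fill probability are illustrations, not Mockaroo syntax; in a real schema each piece would live in its own formula column:

```ruby
# Record N is filled out only if record N-1 was. `completed` plays the
# role of the hidden flag suggested above; once one record is blank,
# every record after it stays blank.
def build_records(count, fill_chance: 0.5, rng: Random.new(42))
  records = []
  prev_completed = true             # record 1 is always eligible
  count.times do |i|
    completed = prev_completed && rng.rand < fill_chance
    records << {
      name:      "record#{i + 1}",
      completed: completed,
      lastName:  completed ? 'Smith' : nil  # blank when not completed
    }
    prev_completed = completed
  end
  records
end
```

The point of the sketch is the dependency chain: record 18's fields only make sense if every earlier record's completed flag is resolved first, which matches the "hidden fields at the top of the file" workaround.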

There aren't any hard limits, but 300+ fields is simply more than the interface was designed for, so it's more of a soft limit.
