I am trying to create a database that holds a list of land lots. The values are typically "2A" or "10.0A" (where A is acres). The user enters the min and/or max value they are looking for. For example, I might want a lot greater than 2A and less than 10A. For the most part the code works: if I enter a min of 2A, I get entries greater than 2. However, with anything above 9 it gets weird, returning 5A and 2A, even if I use "10.0A".
Here is my SQL statement:
select street_number, street_name, city, zip
from sf_records2
where site_area >= '10.0A'
  and site_area != 'IRR'
  and site_area REGEXP 'A$'
order by street_name, street_number
limit 0, 250;
With '10.0A' as the min, the example returns include 5A and 2A lots. If I use '9.1A', it works (for the most part). And just for fun, '15A' brings the odd results back again. So anything '10' and above is just weird, decimal point or not.
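What the symptoms above are consistent with is a plain string comparison: if `site_area` is a text column, `>=` compares character by character, so '2A' sorts after '10.0A' because '2' > '1'. A minimal sketch, using SQLite in place of MySQL (an assumption for the sake of a runnable example; the same lexicographic ordering rule applies in both):

```python
import sqlite3

# In-memory stand-in for the sf_records2 table, with site_area as TEXT.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sf_records2 (street_number TEXT, site_area TEXT)")
conn.executemany(
    "INSERT INTO sf_records2 VALUES (?, ?)",
    [("1", "2A"), ("2", "5A"), ("3", "9.1A"), ("4", "10.0A"), ("5", "15A")],
)

# The comparison from the question: because site_area is text, this is a
# character-by-character string comparison, not a numeric one.
rows = conn.execute(
    "SELECT site_area FROM sf_records2 WHERE site_area >= '10.0A'"
).fetchall()

# '2A' >= '10.0A' is true as strings ('2' sorts after '1'), so every row
# comes back, including the 2A and 5A lots.
print(sorted(r[0] for r in rows))
```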
Notes: 1. The REGEXP 'A$' is there because the column also contains values in 'SqFt'.
2. If I were generating the data, I would split the 'A' off before it went into the database or, ideally, remove it altogether, but I am stuck with data from several sources that come in this format.
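One possible workaround, given that the column can't be changed: force a numeric comparison by casting the text. This is a sketch under the assumption that a leading-number cast is acceptable; again it uses SQLite's `CAST(... AS REAL)` in place of MySQL, where the equivalent would be something like `site_area + 0 >= 10.0` or `CAST(site_area AS DECIMAL(10,1))` (both read the leading numeric prefix and ignore the trailing 'A'):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sf_records2 (site_area TEXT)")
conn.executemany(
    "INSERT INTO sf_records2 VALUES (?)",
    [("2A",), ("5A",), ("9.1A",), ("10.0A",), ("15A",), ("IRR",), ("5000SqFt",)],
)

# LIKE '%A' stands in for REGEXP 'A$' (SQLite has no REGEXP by default);
# CAST(... AS REAL) parses the numeric prefix, so '10.0A' compares as 10.0.
rows = conn.execute(
    "SELECT site_area FROM sf_records2 "
    "WHERE site_area LIKE '%A' AND site_area != 'IRR' "
    "AND CAST(site_area AS REAL) >= 10.0"
).fetchall()
print(sorted(r[0] for r in rows))
```

With the cast in place, only the 10.0A and 15A lots match a min of 10, which is the behavior the question is after.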
Any ideas or thoughts are appreciated.