1) The hard work here is the import of the CSV - handling the various formats, with and without quotes, requires a bit of thought. In particular, do not use explode(",", $sLine) to break a line into columns, as it will not work if any of the columns contain embedded commas. Either parse the line yourself or use one of the built-in PHP functions, e.g. str_getcsv().
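To make the embedded-comma problem concrete, here is a small sketch (the sample line is made up) contrasting explode() with str_getcsv():

```php
<?php
// Hypothetical CSV line with an embedded comma inside a quoted field.
$sLine = '42,"Smith, John",john@example.com';

// explode() splits on EVERY comma, so the quoted field is broken in two:
$aBad = explode(",", $sLine);   // 4 pieces: the name is split apart

// str_getcsv() understands the quoting rules:
$aGood = str_getcsv($sLine);    // 3 columns, name intact

// For a whole file, fgetcsv() reads and parses one line per call:
// $fh = fopen('import.csv', 'r');
// while (($aCols = fgetcsv($fh)) !== false) { /* process $aCols */ }
```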
2) Where you import the data to is somewhat secondary. The choices are:
a) Into a table as you suggest
b) Into an array. [BTW, PHP handles multi-dimensional arrays as arrays of arrays, so in this option the array will be an array of 'lines' and each 'line' will be an array of columns.]
Which is best? I think it depends on the number of rows. If you import into an array you save the INSERT time, but for each array element (line) you have to do a SQL lookup. If you import into a table then you pay the INSERT cost, but only one SQL query is needed to apply the blacklist (and one DELETE matching N rows is way faster than N lookups).
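A minimal sketch of the single-DELETE idea, using an in-memory SQLite database via PDO so it runs standalone (table and column names here are made up - substitute your own schema and driver):

```php
<?php
// In-memory SQLite stand-in for your real database.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE temptable (email TEXT)");
$db->exec("CREATE TABLE blacklist (email TEXT)");

// Imported CSV rows (normally inserted in a loop over fgetcsv()):
$ins = $db->prepare("INSERT INTO temptable VALUES (?)");
foreach (['a@x.com', 'b@x.com', 'c@x.com'] as $sEmail) {
    $ins->execute([$sEmail]);
}
$db->exec("INSERT INTO blacklist VALUES ('b@x.com')");

// One query deletes every blacklisted row, however many match:
$db->exec("DELETE FROM temptable
           WHERE email IN (SELECT email FROM blacklist)");

$nLeft = (int)$db->query("SELECT COUNT(*) FROM temptable")->fetchColumn();
// $nLeft is 2: only the blacklisted row was removed
```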
There are MANY PHP array functions and I'm sure you could find one that would let you import the CSV into an array, then import the blacklist into an array and then 'subtract' the two. Such an approach would need lots of comments IMHO, whereas the SQL approach is much more self-documenting.
If you want to re-display the data to a human and maybe allow them to 'tweak' which rows are deleted etc, then the SQL approach wins hands-down as it can preserve the data across page calls.
E.g. instead of the DELETE query you could have a 'delete_me' column in your data: on the first pass you set it to 'Y' according to the blacklist (update temptable where id in blacklisttable). Then you display the results to a human. They can tick or untick rows. Then they press delete and you do "delete from temptable where delete_me = 'Y'".
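The two-pass flow above might be sketched like so, again against an in-memory SQLite database and with illustrative table/column names only (in a real app the two passes happen on separate page calls, with the human review in between):

```php
<?php
// In-memory SQLite stand-in; 'delete_me' defaults to 'N'.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE temptable (id INT, email TEXT, delete_me TEXT DEFAULT 'N')");
$db->exec("CREATE TABLE blacklist (email TEXT)");
$db->exec("INSERT INTO temptable (id, email) VALUES (1,'a@x.com'), (2,'b@x.com')");
$db->exec("INSERT INTO blacklist VALUES ('b@x.com')");

// Pass 1: flag the blacklisted rows instead of deleting them.
$db->exec("UPDATE temptable SET delete_me = 'Y'
           WHERE email IN (SELECT email FROM blacklist)");

// ...here you re-display the rows so the human can tick/untick delete_me...

// Pass 2: only after confirmation do the actual delete.
$db->exec("DELETE FROM temptable WHERE delete_me = 'Y'");

$nLeft = (int)$db->query("SELECT COUNT(*) FROM temptable")->fetchColumn();
// $nLeft is 1: the flagged row is gone, the other survives
```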
Remember Rule 1: Make it work before you make it work fast. And Rule 2: If it works, but no one can understand your code, then it doesn't work.
Hope that helps.