Here's how I did it. It's very simple, mind you, but it works.
I have a submit page with a form for others [or myself] to submit pages to be reviewed for indexing. It inserts these links into a MySQL table.
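A minimal sketch of that submit handler might look like this. The table and column names (`pending`, `url`) and connection details are my assumptions — the post only says the links go into a MySQL table:

```php
<?php
// Sketch of the submit handler -- table name "pending" and column "url"
// are assumptions; adjust to your actual schema.
$db = mysqli_connect('localhost', 'user', 'pass', 'linkdb');

if (!empty($_POST['url'])) {
    // Prepared statement avoids SQL injection from the submitted link
    $stmt = mysqli_prepare($db, 'INSERT INTO pending (url) VALUES (?)');
    mysqli_stmt_bind_param($stmt, 's', $_POST['url']);
    mysqli_stmt_execute($stmt);
}
```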
I have a script with an SQL statement that uses a LEFT JOIN to get only the items not yet added. This script displays the links (you can show each link in an input box so the reviewer can edit it [to add a trailing slash, an http://www prefix, or whatever]) with two checkboxes, one for add and one for deny.
Deny deletes the row; add inserts it into the site table with upddate=0.
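The review listing and its add/deny handler could be sketched roughly as below. Again, the `pending` and `site` table names and the join on `url` are assumptions; the LEFT JOIN with `IS NULL` is the standard way to keep only submissions with no matching row in the site table:

```php
<?php
// Sketch of the review script -- schema names are assumptions.
$db = mysqli_connect('localhost', 'user', 'pass', 'linkdb');

// LEFT JOIN keeps only pending links not already in the site table
$sql = 'SELECT p.id, p.url FROM pending p
        LEFT JOIN site s ON s.url = p.url
        WHERE s.url IS NULL';
$res = mysqli_query($db, $sql);

while ($row = mysqli_fetch_assoc($res)) {
    // Input box lets the reviewer fix up the link before approving it
    echo '<input type="text" name="url[' . $row['id'] . ']" value="'
       . htmlspecialchars($row['url']) . '">'
       . ' <input type="checkbox" name="add[' . $row['id'] . ']">add'
       . ' <input type="checkbox" name="deny[' . $row['id'] . ']">deny<br>';
}

// On POST: deny deletes the pending row, add moves it to site with upddate=0
foreach ($_POST['url'] ?? [] as $id => $url) {
    if (isset($_POST['add'][$id])) {
        $stmt = mysqli_prepare($db, 'INSERT INTO site (url, upddate) VALUES (?, 0)');
        mysqli_stmt_bind_param($stmt, 's', $url);
        mysqli_stmt_execute($stmt);
    }
    // Both add and deny remove the row from the pending table
    $stmt = mysqli_prepare($db, 'DELETE FROM pending WHERE id = ?');
    mysqli_stmt_bind_param($stmt, 'i', $id);
    mysqli_stmt_execute($stmt);
}
```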
I have another script that, when executed (this could be done from a link on the first script), runs spider.php via exec(). It finds all the sites where upddate=0 and loops through them, calling exec() for each. I put a LIMIT of 10 on it just so I have a little more control over it. [Once spider.php runs for a site, it updates upddate to the current timestamp when it locks and unlocks the tables.]
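That runner script could be as small as the sketch below. How spider.php takes its target (here, a URL argument) is an assumption on my part:

```php
<?php
// Sketch of the runner script -- spider.php's calling convention is assumed.
$db = mysqli_connect('localhost', 'user', 'pass', 'linkdb');

// LIMIT 10 keeps each run small for a little more control
$res = mysqli_query($db, 'SELECT url FROM site WHERE upddate = 0 LIMIT 10');

while ($row = mysqli_fetch_assoc($res)) {
    // spider.php sets upddate to the current timestamp when it locks and
    // unlocks the tables, so the same site isn't picked up on the next run
    exec('php spider.php ' . escapeshellarg($row['url']));
}
```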
What I did may not be the best solution, but it got things working quickly. I will put together a better solution sometime in the next couple of weeks.