Mufasa
-
5/7/2008 12:19:21 PM
Newsletter list management feature requests
A couple of important (IMO) feature requests that need to be added to the Newsletter module:
1. When a user unsubscribes from a list, they are deleted (DELETE FROM ...) from the database, and a single field in the Newsletter table is incremented to note that a user was removed (not which one, just that *a* user was removed). This is a problem if that Newsletter subscription list is not the master mailing list. For example, we have different lists for different groups, which are also modified by other third-party tools, so we just import the subscription list for each newsletter we send out. But when unsubscriptions come in, we have no way of knowing *who* unsubscribed. The only way we can figure this out now is to wait a while after the list has been sent, then export everyone and compare the lists to find the missing names (not a very simple task--we have to calculate who is not in both sets, and importing/exporting 30,000 names often times out, so we have to split the list up, etc.). We also then miss any users who unsubscribe later, unless we re-calculate the missing names and compare them against the ones already removed. Obviously, this process could be improved a lot if you just marked entries as unsubscribed (w/ a timestamp), which could then be spit out of the system very easily.
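Something like this is what I mean (SQLite-style sketch; the table and column names here are made up, not the module's actual schema):

```python
import sqlite3

# Hypothetical subscribers table -- illustrative names only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE subscribers (
        email TEXT PRIMARY KEY,
        unsubscribed_at TEXT  -- NULL while still subscribed
    )
""")
conn.executemany("INSERT INTO subscribers (email) VALUES (?)",
                 [("a@example.com",), ("b@example.com",)])

def unsubscribe(conn, email):
    # Mark the row with a timestamp instead of DELETE-ing it.
    conn.execute(
        "UPDATE subscribers SET unsubscribed_at = datetime('now') "
        "WHERE email = ?", (email,))

unsubscribe(conn, "a@example.com")

# Now you can export exactly who unsubscribed (and when) --
# no set-difference gymnastics against an old export.
unsubs = [r[0] for r in conn.execute(
    "SELECT email FROM subscribers WHERE unsubscribed_at IS NOT NULL")]
print(unsubs)  # ['a@example.com']
```

The row stays in the table, so sends would just need a `WHERE unsubscribed_at IS NULL` filter, and the "who unsubscribed since date X" export becomes one query.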
2. The paging on the various Newsletter subscriber lists isn't very good. I think step one to improve it would be to show "Page X of XX" and an "XXX records found" count, or something like that. Also, add "First page" and "Last page" links. (Of course, you will want to run a COUNT(*) query first so the DataSets don't have to iterate through the entire list on every page view.)
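The COUNT-first approach I'm picturing (again a SQLite sketch with made-up names, but the same idea applies to whatever SQL the module runs):

```python
import math
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE subscribers (email TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO subscribers VALUES (?)",
                 [(f"user{i}@example.com",) for i in range(95)])

PAGE_SIZE = 20

def fetch_page(conn, page_num):
    # One cheap COUNT(*) up front gives "X records found" and
    # "Page X of XX" without materializing the whole list.
    total = conn.execute("SELECT COUNT(*) FROM subscribers").fetchone()[0]
    pages = max(1, math.ceil(total / PAGE_SIZE))
    rows = conn.execute(
        "SELECT email FROM subscribers ORDER BY email LIMIT ? OFFSET ?",
        (PAGE_SIZE, (page_num - 1) * PAGE_SIZE)).fetchall()
    return total, pages, [r[0] for r in rows]

total, pages, rows = fetch_page(conn, 5)
print(f"Page 5 of {pages} -- {total} records found, {len(rows)} on this page")
```

`pages` also gives you everything needed for the "First page"/"Last page" links (last page is just `page_num = pages`).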
3. Importing and exporting large lists often causes HTTP/server timeouts. I know the timeouts are there for a reason, so I think the best solution would be to support import and export of XLS-formatted data. You could also add an option (or do it for everything and not offer it as an option--whichever you think works best) to run the import/export processing as a background task via your Scheduler, so the page response wouldn't have to wait on the list processing. (Maybe even an AJAX interface w/ a progress indicator for the user while they're waiting?) (Also, I haven't looked at the SQL directly, but you are doing batch INSERTs during import and not individual SQL statements, correct? If not, you should switch over--it makes a big difference.)
4. Import should not fail for *all* addresses when the list contains one or more invalid addresses. Obviously, erroneous addresses should be rejected and warned about, but the rest should still get imported. Or, if you think some users may want it to be a transactional process, simply add a toggle like "Do not import any addresses if any fail" (I'm not good at succinct, accurate warnings or labels to show the user, but you get the idea), so that in cases where bad addresses don't matter and can be ignored, they don't have to be removed from the import list one by one.
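Roughly like this (sketch with made-up names; the regex is a deliberately rough validity check, nowhere near a full RFC-grade parser):

```python
import re
import sqlite3

# Rough check: something@something.something -- illustrative only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def import_addresses(conn, addresses, all_or_nothing=False):
    """Import the valid addresses; return the rejected ones.

    all_or_nothing=True is the proposed "Do not import any addresses
    if any fail" toggle: one bad address means nothing is imported.
    """
    rejected = [a for a in addresses if not EMAIL_RE.match(a)]
    if rejected and all_or_nothing:
        return rejected  # import nothing, just report the failures
    good = [(a,) for a in addresses if EMAIL_RE.match(a)]
    conn.executemany("INSERT OR IGNORE INTO subscribers VALUES (?)", good)
    conn.commit()
    return rejected

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE subscribers (email TEXT PRIMARY KEY)")
bad = import_addresses(
    conn, ["ok@example.com", "not-an-address", "two@example.com"])
imported = conn.execute("SELECT COUNT(*) FROM subscribers").fetchone()[0]
print(imported, bad)  # 2 ['not-an-address']
```

Default behavior imports the good rows and hands back the bad ones for the warning message; flipping the toggle gives the strict users their transactional all-or-nothing run.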