The e-mail list membership climbed to 100 in late 1996 and held roughly constant for a year. It then climbed to 140 in the winter of 1997-98, largely because of the key role the list played in getting information to residents about an important, rapidly evolving problem (potential flooding).
I was quite surprised by how many people had trouble with the instructions, and by how much trouble they had (I expected a moderate amount). If you read between the lines of the instructions on the mailing lists, you will find hints of the problems encountered. The biggest ones were that various people had problems with even the basic subscribe and unsubscribe requests.
Since the turn-over rate on our list is so small,
we decided not to use an automated system to do adds and drops.
My experience with the various Majordomo systems is that they
can be perplexing in all but the simplest cases to all but
the most savvy and experienced of users.
I have been on mailing lists where the majority of messages
related to unsubscribing (no exaggeration):
(1) why won't the Majordomo unsubscribe me,
(2) shouting at that person to stop bothering the whole list
with these messages,
(3) shouting at the people in (2) to be more polite to (1),
(4) flame wars about whether "unsubscribe" is hard to spell
(the most common user error),
and whether or not Majordomo should be able to handle
the various common misspellings,
and (5) ...
Since just one of these events can drive many members off the list,
I decided to handle subscription requests manually,
so that I could deal gently and intelligently with requests
from our less experienced/sophisticated users.
Note:
The examples published (in our newsletter and on the Web)
for subscribe/unsubscribe requests are consistent
with using Majordomo,
in case we ever decide to switch to it.
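For illustration, a Majordomo-style request is an e-mail message along
the following lines (the addresses and list name below follow the
placeholder convention used elsewhere on this page; our actual published
examples may differ in detail):

    To: Majordomo@Your.Org
    Subject: (ignored by Majordomo)

    subscribe YourList YourAddress

An unsubscribe request is the same with "unsubscribe" as the command
word. Note that the command goes in the message body, not the Subject
line -- a detail that commonly trips up inexperienced users.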
I randomly picked a response and found 26 lines of headers, an 18-line signature block, more than 40 lines of included text, and 2 lines of actual content -- roughly 2 lines out of 86-plus, or under 3% content.
Consequently, an e-mail exchange that is printed on four pages in a reduced font may represent less than half a page of content if it were to be merged and rewritten.
I have complained to several graphics-heavy sites about how slow they were. They responded by acknowledging that the site was slow to start, but claimed that this was OK because once you had the images cached, navigating through the site was very fast, since subsequent pages reused many of the same images and backgrounds. Their notion of how sites are visited contrasts starkly with how I actually visited their site (and most other sites). I typically find a site through a search engine, and although most of the sites I visit are not relevant to my immediate need, some are promising enough that I bookmark them. In a subsequent visit, I typically browse only a small portion of the site, sometimes just exploring, but more likely checking it before going on to a search engine. Thus, because I make multiple widely separated short visits, caching of graphics does little to help me, whereas if I were to spend substantial time browsing the site in a single visit, caching would be a major win.
Aside: This is at best guesswork, since your user community is likely to be very diverse and hard to characterize.

Negative example: As part of my "day job", I visited the site of a vendor (who shall not be named) in search of information on one of their product families. I went down the link entitled "Products" and found the items in question. However, the technical data was spotty, with most of the page devoted to "puffery" (how great their company was). I sent a quick note to their webmaster expressing my annoyance. His response was that the technical data was under the link for "Press Releases" -- I had not even bothered to explore that link because at most similar sites, the press releases are puffery with little technical detail (a press release being something that you give a reporter to lightly edit and submit as a news article).
YourName, <A HREF="MAILTO:YourAddress?SUBJECT=PageIdentifier">YourAddress</A>

Since the user may have seen only the one page in question, it is unrealistic to expect them to produce a description that distinguishes that page from all the other pages that you are responsible for. Do not expect them to type in the URL:
<A HREF="MAILTO:Person1@Your.Org, Person2@Your.Org?SUBJECT=PageId"> (test)
<A HREF="MAILTO:Person1@Your.Org?SUBJECT=PageId&CC=Person2@Your.Org, Person3@Your.Org"> (test)
( link:"www.cyberstars.com/bp" or link:"www.pgc.com/bpa" or link:"www2.bpaonline.org/bpa" )
and not ( host:"www.cyberstars.com" or host:"www2.bpaonline.org" )

Explanation: our pages are spread over two hosts, www.cyberstars.com and www2.bpaonline.org, but both hosts also serve a variety of other home pages, so we have specified the link down to the top-level directories for our pages; the "and not" clause excludes links coming from our own pages. www.pgc.com/bpa was the previous host for our site: some other sites are very slow to update their links (even after we have notified them), and even after they have updated those pages, there can be a significant delay before the various search engines revisit them. Note: some of the search engines have an easy mechanism for requesting a visit/re-visit to a page, and this request typically can be made by anyone, not just the owner of the page.
#!/bin/sh
# Intended to be used from the FIND command (and it from CRONTAB).
# Arg 1: directory
# Action: put "empty" index.html files in directories that do not have one.
# Purpose: keep net crawlers and other indexing systems from looking
#   into directories via the automatically created index file.
#   This is an issue for
#   - revision control subdirectories (e.g., RCS): you probably don't want
#     your obsolete text (and any comments in the change logs)
#     searchable from these indices.
#   - draft files in subdirectories.
PATH=/usr/bin
export PATH
if [ -f "$1/index.html" -o -f "$1/index.shtml" -o \
     -f "$1/welcome.html" -o -f "$1/welcome.shtml" ] ; then
    exit
fi
echo 'Creating empty (blocking) index.html file in' "$1"
echo 'no index' > "$1/index.html"
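A hypothetical invocation (the script path and document root below are assumptions for illustration, not from our actual setup): run it over the document tree nightly from CRONTAB, e.g.:

# Apply the script (assumed saved as /usr/local/bin/block-index) to every
# directory under the (assumed) document root /home/www/docs.
find /home/www/docs -type d -exec /usr/local/bin/block-index {} \;

Because the script exits quietly when a real index file already exists, it is safe to re-run over the whole tree; only newly created directories get the blocking file.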
In one case, one of our Web authors created a basic page for a committee and added a substantial set of "Other Resources" links. The committee vetoed the page, saying that the Web was not what they were about.
We have been encouraging the city and its contractors to provide
us with electronic versions of their announcements and maps.
There have been limited successes, but there is a long way to go.
We have also attempted to get the contractors to provide detailed
schedules of when various streets will be blocked or partially blocked
by their construction, so that we can post it on the Web
(or provide a link to a page on their Web site).
No success here yet.
One contractor was willing, but updating the information kept
"falling between the cracks".
We used these Web pages as part of our briefing of the press, and the URL of the top-level page was given in newspaper articles as the place people could go to get more details.
We had this listed under the "Hot Topics" link on our Home Page and, during the key period, a link appeared in the "banner box" on the Home Page itself.
However, collaborative authoring through e-mail has many problems when you apply it to a composite document, such as a newsletter composed of articles from multiple people.