Arts & Humanities
“Eclipse and Revelation” by Professor Henrike Lange now available as an audio book
Henrike Lange’s recent book Eclipse and Revelation: Total Solar Eclipses in Science, History, Literature and the Arts has been released as an audio book read by Professor Chris Hallett.
Winter Reads 2024
Cozy up this winter with these great reads from our library. These novels range from fantasy to mystery to romance and are perfect for cold weather. Check out UCB Overdrive for more!
New book by Jeroen Dewulf
Nova História do Cristianismo Negro na África Ocidental e nas Américas makes a historiographical intervention in the history of black Catholicism and, more broadly, black religion in the Americas. Dewulf’s central and well-documented assertion is that black Christianity, both Catholic and Protestant, has roots in pre-Tridentine Portuguese Catholicism. Even before the advent of the slave trade, Catholicism had become an indigenous African religion, at times assuming pre-Tridentine and syncretic forms that Europeans of the post-Tridentine period found irreconcilable with their own. This argument has significant historiographical consequences; the long-standing confusion about the religiosity of enslaved people is, at least in part, the result of assumptions that Africans knew little about Christianity before their enslavement. On the contrary, Dewulf traces these religious forms to the slave ships that transported human “cargo” to the Americas. The book is a timely corrective to Catholic and Christian studies scholarship that has long portrayed Christians of African descent as marginalized and atypical people rather than as important global actors. (Citation of the 2023 John Gilmary Shea Prize Committee)
[from publisher’s site]
Jeroen Dewulf is Queen Beatrix Professor in Dutch Studies in the UC Berkeley Department of German, a professor in Berkeley’s Folklore Program, and an affiliated member of the Center for African Studies and the Center for Latin American Studies. He recently completed his long-term role as director of UC Berkeley’s Institute of European Studies, where he is chair of the Center for Portuguese Studies. His main area of research is Dutch and Portuguese colonial history, with a focus on the transatlantic slave trade and the culture and religion of African-descended people in the American diaspora. He also publishes in the field of Folklore Studies and on other aspects of Dutch, German, and Portuguese literature, culture, and history.
Nova História do Cristianismo Negro na África Ocidental e nas Américas. Porto Alegre: EDIPUCRS, 2024.
PhiloBiblon 2024 n. 6 (December): News
With this post we announce the transfer of data from BETA, BITAGAP, and BITECA to PhiloBiblon (Universitat Pompeu Fabra). This is the final transfer for BETA and BITECA; from now on, those two databases will be frozen on that site, while BITAGAP will be frozen on December 31.
With this post we also announce that, as of January 1, 2025, anyone looking for data in BETA (Bibliografía Española de Textos Antiguos) should go to FactGrid:PhiloBiblon. BITECA will be in FactGrid on February 1, 2025, and BITAGAP on March 1. From that date on, FactGrid:PhiloBiblon will be open for business while we refine PhiloBiblon UI, PhiloBiblon’s new search interface.
These are necessary steps in the complete transfer of PhiloBiblon to the world of Linked Open Data (LOD).
This dynamic poster by Patricia García Sánchez-Migallón succinctly and engagingly explains the technical history of PhiloBiblon, the LOD configuration, and the process we are following in the current project, “PhiloBiblon: From Siloed Databases to Linked Open Data via Wikibase,” supported by a two-year grant (2023-2025) from the National Endowment for the Humanities:
This is the PDF version of the same poster: PhiloBiblon Project: Biobibliographic database of medieval and Renaissance romance texts.
Dr. García Sánchez-Migallón presented it at CLARIAH-DAY: Jornada sobre humanidades digitales e inteligencia artificial on November 22 at the Biblioteca Nacional de España.
CLARIAH is the consortium of the two European digital-infrastructure projects for the humanities, CLARIN (Common Language Resources and Technology Infrastructure) and DARIAH (Digital Research Infrastructure for the Arts and Humanities). Dr. García Sánchez-Migallón currently works in the CLARIAH-CM office at the Universidad Complutense de Madrid.
Charles B. Faulhaber
University of California, Berkeley
Native-American Heritage Month 2024
Get ready to dive into Native American Heritage Month with these must-read books! From epic legends to fresh voices, these stories celebrate the culture, history, and heart of Native communities. Check out more at UCB Overdrive.
A&H Data: Creating Mapping Layers from Historic Maps
Some of you know that I’m rather delighted by maps. I find them fascinating for many reasons, from their visual beauty to their use of the lie to impart truth, to some of their colors and onward. I think that maps are wonderful and great and superbulous even as I unhappily acknowledge that some are dastardly examples of horror.
What I’m writing about today is the process of taking a historical map (yay!) and pinning it on a contemporary street map in order to use it as a layer in programs like StoryMaps JS or ArcGIS, etc. To do that, I’m going to write about:
Picking a Map from Wikimedia Commons
Wikimedia accounts and “map” markup
Warping the map image
Loading the warped map into ArcGIS Online as a layer
But! Before I get into my actual points for the day, I’m going to share one of my very favorite maps:
Just look at this beauty! It’s an azimuthal projection, centered on the North Pole (more on Wikipedia), from a 16th century Italian cartographer. For a little bit about map projections and what they mean, take a look at NASA’s example Map Projections Morph. Or, take a look at the above map in a short video from David Rumsey to watch it spin, as it was designed to.
What is Map Warping?
While this is in fact one of my favorite maps and I use many an excuse to talk about it, I did actually bring it up for a reason: the projection (i.e., azimuthal) is almost impossible to warp.
As stated, warping a map means taking a historical map and pinning it against a standard, contemporary “accurate” street map (usually in a Mercator projection), typically for analysis or for use in a GIS program.
Here, for example, is the 1913 Sanborn fire insurance map layered into ArcGIS Online.
I’ll be writing about how I did that below. For the moment, note how the Sanborn map is a bit pinched at the bottom and the borders are tilted. The original map wasn’t aligned precisely North and the process of pinning it (warping it) against an “accurate” street map resulted in the tilting.
That was possible in part because the Sanborn map, for all that such maps are quite small and specific, was oriented along a Mercator projection, permitting a rather direct rectification (i.e., warping).
In contrast, take a look at what happens in most GIS programs if one rectifies a map—including my favorite above—which doesn’t follow a Mercator projection:
Warping a Mercator Map
This still leaves the question: How can one warp a map to begin with?
There are several programs that you can use to “rectify” a map. Among others, many people use QGIS (open access; Windows, macOS, Linux) or ArcGIS Pro (proprietary; Windows only).
Here, I’m going to use Wikimaps Warper (for more info), which connects with Wikimedia Commons. I haven’t seen much documentation on the agreements, and I don’t know what kind of server space the Wikimedia groups are working with, but Wikimedia Commons recently made an agreement with Map Warper (open access, link here), and the resulting Wikimaps Warper is (as of the writing of this post in November 2024) in beta.
I personally think the result is one of the easiest options to use at the moment.
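For anyone who would rather script the rectification step than click through a web interface, here is a minimal sketch using GDAL’s Python bindings. The filenames and control-point coordinates are placeholders I’m inventing for illustration; real ground control points would come from matching landmarks, exactly as described in the steps below.

```python
# A hedged sketch: georeference a scanned map with GDAL's Python bindings.
# File names and control points below are made up for illustration.
from osgeo import gdal

# Each GCP pairs a map coordinate (lon, lat) with a pixel position (col, row)
# on the scanned image, the scripted equivalent of dropping pins.
gcps = [
    gdal.GCP(-122.4075, 37.7880, 0, 1523, 2210),  # e.g., Union Square
    gdal.GCP(-122.4194, 37.7793, 0,  801, 2904),
    gdal.GCP(-122.3958, 37.7946, 0, 2366, 1677),
    gdal.GCP(-122.4312, 37.7850, 0,  210, 2450),
]

# Attach the control points to the scan, then warp it into Web Mercator.
gdal.Translate("sanborn_gcps.tif", "sanborn_scan.tif",
               GCPs=gcps, outputSRS="EPSG:4326")
gdal.Warp("sanborn_warped.tif", "sanborn_gcps.tif",
          dstSRS="EPSG:3857", resampleAlg="bilinear")
```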
And on to our steps!
Picking a Map from Wikimedia Commons
To warp a map, one has to have a map. At the moment, I recommend heading over to Wikimedia Commons (https://commons.wikimedia.org/) and selecting something relevant to your work.
Because I’m planning a multi-layered project with my 1950s publisher data, I searched for (san francisco 1950 map) in the search box. Wikimedia returned dozens of Sanborn insurance maps. At some point (22 December 2023), a previous user (Nowakki) had uploaded the San Francisco Sanborn maps from high-resolution digital surrogates from the Library of Congress.
Looking through the relevant maps, I picked Plate 0000a (link) because it captured several areas of the city and not just a single block.
When looking at material on Wikimedia, it’s a good idea to verify your source. Most of us can upload material to Wikimedia Commons, and the information provided there is not always precisely accurate. To verify that I was working with something legitimately useful, I looked through the metadata and checked the original source (LOC). Here, for example, the Wikimedia map claims to be from 1950, while at the LOC the original folder says it’s from 1913.
Feeling good about the legality of using the Sanborn map, I was annoyed about the date. Nonetheless, I decided to go for it.
Moving forward, I checked the quality. Because of how georectification and mapping software works, I wanted as high-quality a map as I could get so that it wouldn’t blur when I zoomed in.
If there hadn’t been a relevant map in Wikimedia Commons already, I could have uploaded one myself (and likely will later). I’ll likely talk about uploading images into Wikimedia Commons in … a couple months maybe? I have so many plans! I find process, and looking at the steps for getting things done, so fascinating.
Wikimedia Accounts and Tags
Before I can do much with my Sanborn map, I need to log in to Wikimedia Commons as a Wiki user. One can set up an account attached to one of one’s email accounts at no charge. I personally use my work email address.
Note: Wikimedia intentionally does not ask for much information about you and states that it is committed to user privacy. Its info pages (link) state that it will not share its users’ information.
I already had an account, so I logged straight in as “AccidentlyDigital” … because somehow I came up with that name when I created my account.
Once I logged in, a few new options appeared on most image and text pages, offering me the opportunity to add or edit material.
Once I picked the Sanborn map, I checked:
- Was the map already rectified?
- Was it tagged as a map?
If the specific map instance has already been rectified in Wikimaps, then there should be some information toward the end of the summary box that has a note about “Geotemporal data” and a linked blue bar at the bottom to “[v]iew the georeferenced map in the Wikimaps Warper.”
If that doesn’t exist, then one might get a summary box that is limited to a description, links, dates, etc., and no reference to georeferencing.
In consequence, I needed to click the “edit” link next to “Summary” above the description. Wikimedia will then load the edit box for only the summary section, which will appear with all the text from the public-facing box surrounded by standard wiki-language markup.
All I needed to do was change the “{{Information” to “{{Map” and then hit the “Publish” button toward the bottom of the edit box to release my changes.
The updated, public-facing view will now have a blue button offering to let users “Georeference the map in Wikimaps Warper.”
Once the button appeared, I clicked that lovely, large, blue button and went off to have some excellent fun (my version thereof).
Warping the map
When I clicked the “Georeference” button, Wikimedia sent me away to Wikimaps Warper (https://warper.wmflabs.org/). The Wikimaps interface showed me a thumbnail of my chosen map and offered to let me “add this map.”
I, delighted beyond measure, clicked the button and then went and got some tea. Depending on how many users are in the Wikimaps servers and how big the image file for the map is, adding the file into the Wikimaps servers can take between seconds and minutes. I have little patience for uploads and almost always want more tea, so the upload time is a great tea break.
Once the map loaded (I can get back to the file through Wikimedia Commons if I leave), I got an image of my chosen map with a series of options as tabs above the map.
Most of the tabs offer precisely what their names suggest:
- Show displays an image of the loaded map.
- Edit allows me to edit the metadata (i.e., title, cartographer, etc.) associated with the map.
- Rectify allows me to pin the map against a contemporary street map.
- Crop allows me to clip off edges and borders of the map that I might not want to appear in my work.
- Preview allows me to see where I’m at with the rectification process.
- Export provides download options and HTML links for exporting the rectified map into other programs.
- Trace would take me to another program with tracing options. I usually ignore the tab, but there are times when it’s wonderful.
The Sanborn map didn’t have any information I felt inclined to crop, so I clicked straight onto the “Rectify” tab and got to work.
As noted above, the process of rectification involves matching the historic map against a contemporary map. To start, one needs at least four pins matching locations on each map. Personally, I like to start with some major landmarks. For example, I started by finding Union Square and putting pins on the same location in both maps. Once I was happy with my pins’ placement on both maps, I clicked the “add control point” button below the two maps.
Once I had four pins, I clicked the gray “warp image!” button. The four points were hardly enough and my map curled badly around my points.
To straighten out the map, I went back in and pinned the four corners of the map against the contemporary map. I also pinned several street corners because I wanted the rectified map to be as precisely aligned as possible.
All said, I ended up with more than 40 pins (i.e., control points). As I went, I warped the image every few pins in order to save it and see where the image needed alignment.
As I added control points and warped my map, the pins shifted colors between greens, yellows, and reds, with the occasional blue. The colors showed where the two maps were in close alignment and where they were being pinched and, well, warped, to match.
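Wikimaps doesn’t document exactly how it scores each pin, but the idea behind those colors is per-point error: fit a transformation to all of the control points and see how far each pin lands from where the fit predicts. A rough sketch of that idea, assuming a plain affine fit with NumPy and using made-up coordinates:

```python
# A rough sketch of per-control-point error, assuming a simple affine fit.
# Pixel and map coordinates below are invented for illustration.
import numpy as np

pixel = np.array([[1523, 2210], [801, 2904], [2366, 1677], [210, 2450]], float)
world = np.array([[-122.4075, 37.7880], [-122.4194, 37.7793],
                  [-122.3958, 37.7946], [-122.4312, 37.7850]])

# Least-squares affine fit: world is approximately [pixel, 1] @ A
ones = np.ones((len(pixel), 1))
A, *_ = np.linalg.lstsq(np.hstack([pixel, ones]), world, rcond=None)
predicted = np.hstack([pixel, ones]) @ A

# Residuals: larger values flag the pins that are being "warped" the most.
residuals = np.linalg.norm(predicted - world, axis=1)
print(residuals)
```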
Loading the warped map into ArcGIS Online as a layer
Once I was happy with the Sanborn image rectified against the OpenStreetMap that Wikimaps draws in, I was ready to export my work.
In this instance, I eventually want to have two historic maps as layers and two sets of publisher data (1910s and 1950s).
To work with multiple layers, I needed to move away from Google My Maps and toward a more complex GIS program. Because UC Berkeley has a subscription to ArcGIS Online, I headed there. If I hadn’t had access to that online program, I’d have gone to QGIS. For an access point to ArcGIS online or for more on tools and access points, head to the UC Berkeley Library Research Guide for GIS (https://guides.lib.berkeley.edu/gis/tools).
I’d already set up my ArcGIS Online (AGOL) account, so I jumped straight in at https://cal.maps.arcgis.com/ and then clicked on the “Map” button in the upper-left navigation bar.
On the Map screen, ArcGIS defaulted to a map of the United States in a Mercator projection. ArcGIS also had the “Layers” options opened in the left-hand tool bars.
Because I didn’t yet have any layers except for my basemap, ArcGIS’s only option in “Layers” was “Add.”
Clicking on the down arrow to the right of “Add,” I selected “Add layer from URL.”
In response, ArcGIS Online gave me a popup box with a space for a URL.
I flipped back to my Wikimaps screen and copied the “Tiles (Google/OSM scheme),” which in this case read https://warper.wmflabs.org/maps/tile/7258/{z}/{x}/{y}.png.
Flipping back to ArcGIS Online, I pasted the tile link into the URL text box and made sure that the auto-populated “Type” information about the layer was accurate. I then hit a series of “Next” buttons to assure ArcGIS Online that I really did want to use this map.
Warning: Because I used a link, the resulting layer is drawn from Wikimaps every time I load my ArcGIS project. That does mean that if I had a poor internet connection, the map might take a hot minute to load or fail entirely. On UC Berkeley campus, that likely won’t be too much of an issue. Elsewhere, it might be.
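As an aside, ArcGIS Online is not the only thing that can consume that tile URL: any web-map client that understands a {z}/{x}/{y} scheme can use it. A minimal sketch, assuming the folium Python library (my choice here, not part of the Wikimaps or ArcGIS workflow):

```python
# A sketch, assuming the folium library: load the Wikimaps tile export
# as an overlay on an OpenStreetMap base map.
import folium

m = folium.Map(location=[37.79, -122.40], zoom_start=14)

folium.TileLayer(
    tiles="https://warper.wmflabs.org/maps/tile/7258/{z}/{x}/{y}.png",
    attr="Sanborn Map Company, 1913, via Wikimaps Warper",
    name="Sanborn 1913",
    overlay=True,
).add_to(m)

folium.LayerControl().add_to(m)
m.save("sanborn_overlay.html")
```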
Once my image layer loaded, I made sure the view was centered on San Francisco and saved my map with a relevant title. Good practice means that I also include a map description with citation information for the Sanborn map layer so that viewers will know where my information comes from.
Once I’ve saved it, I can mess with share settings and begin offering colleagues and other publics the opportunity to see the lovely, rectified Sanborn map. I can also move toward adding additional layers.
Next Time
Next post, I plan to write about how I’m going to add my lovely 1955 publisher dataset on top of a totally different, 1950 San Francisco map as a new layer. Yay!
A&H Data: Designing Visualizations in Google Maps
This map shows the locations of the bookstores, printers, and publishers in San Francisco in 1955 according to Polk’s Directory (SFPL link). The map highlights how many there were as well as their concentration downtown. That number, combined with location, suggests that publishing was a thriving industry.
Using my 1955 publishing dataset in Google My Maps (https://www.google.com/maps/d), I have linked the directory addresses of those business categories with a contemporary street map and used different colors to highlight the different types. The contemporary street map allows people to get a sense of how the old data compares to what they know (if anything) about the modern city.
My initial Google My Map, however, was a bit hard to read because of the lack of contrast between my points as well as how they blended into the base map. One of the things that I like to keep in mind when working with digital tools is that I can often change things. Here, I’m going to poke at and modify my:
- Base map
- Point colors
- Information panels
- Sharing settings
My goal in doing so is to make the information I want to understand for my research more visible. I want, for example, to be able to easily differentiate between the 1955 publishing and printing houses versus booksellers. Here, contrasting against the above, is the map from the last post:
Quick Reminder About the Initial Map
To map data with geographic coordinates, one needs a GIS program (see the US.gov discussion of GIS). In part because I didn’t yet have the latitude and longitude coordinates filled in, I headed over to Google My Maps (a scripted way to fill in those coordinates yourself is sketched after the list below). I wrote about this last post, so I shan’t go into much detail. Briefly, those steps included:
- Logging into Google My Maps (https://www.google.com/maps/d/)
- Clicking the “Create a New Map” button
- Uploading the data as a CSV sheet (or attaching a Google Sheet)
- Naming the Map something relevant
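If I had wanted to fill in the latitude and longitude columns myself before uploading, a small geocoding script would have done it. A minimal sketch, assuming the pandas and geopy libraries and a CSV with an “address” column (the file and column names here are stand-ins for my actual spreadsheet):

```python
# A sketch, assuming pandas + geopy: fill in latitude/longitude from addresses.
# "publishers_1955.csv" and the "address" column are stand-in names.
import pandas as pd
from geopy.geocoders import Nominatim
from geopy.extra.rate_limiter import RateLimiter

df = pd.read_csv("publishers_1955.csv")

geolocator = Nominatim(user_agent="bay-area-publishers-1955")
geocode = RateLimiter(geolocator.geocode, min_delay_seconds=1)  # be polite

locations = df["address"].apply(geocode)
df["latitude"] = locations.apply(lambda loc: loc.latitude if loc else None)
df["longitude"] = locations.apply(lambda loc: loc.longitude if loc else None)

df.to_csv("publishers_1955_geocoded.csv", index=False)
```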
Now that I have the map, I want to make the initial conclusions within my work from a couple weeks ago stand out. To do that, I logged back into My Maps and opened up the saved “Bay Area Publishers 1955.”
Base Map
One of the reasons Google can provide My Maps at no direct charge is its advertising revenue. To create an effective visual, I want to be able to identify my own information without losing my data among all the ads.
To move in that direction, I head over to the My Maps edit panel, where there is a “Base map” option with a down arrow. Hitting that down arrow, I am presented with a choice of nine different base maps. What works for me at any given moment depends on the type of information I want my data paired with.
The default for Google Maps is a street map. That street map emphasizes business locations and roads, since it is built for looking up directions. Some of Google My Maps’ other options focus on geographic features, such as mountains or oceans. Because I’m interested in San Francisco publishing, I want a sense of the urban landscape and proximity; I don’t particularly need a map focused on ocean currents. What I do want is a street map with dimmer colors than Google’s standard base map so that my data layer is distinguishable from Google’s landmarks, stores, and other points of interest.
Nonetheless, when there are only nine maps available, I like to try them all. I love maps and enjoy seeing the different options, colors, and features, despite the fact that I already know these maps well.
The options that I’m actually considering are “Light Political” (center left in the grid), “Mono City” (center of the grid), or “White Water” (bottom right). These base map options have the lighter-toned background I want, which lets my dataset points stand clearly against them.
For me, “Light Political” is too pale. With white streets on light gray, the streets end up sinking into the background, losing some of the urban landscape that I’m interested in. The bright, light blue of the ocean also draws attention away from the city and toward the border, which is precisely what a political map is meant to do.
I like “Mono City” better as it allows my points to pop against a pale background while the ocean doesn’t draw focus to the border.
Of these options, however, I’m going to go with the “White Water” street map. Here, the city is done up in various grays and oranges, warming the map in contrast to “Mono City.” This particular style also adds detail to some of the geographic landmarks, drawing attention to the city as a lived space. Consequently, even though the white water creeps me out a bit, this map gets closest to what I want for my research’s message. I also know that for this dataset, I can set the map zoom to limit the amount of water displayed on the screen.
Point colors
Now that I’ve got my base map, I’m on to choosing point colors. I want them to reflect my main research interests, but I’ve also got to pick within the scope of the limited options that Google provides.
I head over to the Edit/Data pane in the My Maps interface. There, I can “Style” the dataset. Specifically, I can tell the GIS program to color my markers by the information in any one of my columns. I could have points all colored by year (here, 1955) or state (California), rendering them monochromatic. I could go by latitude or name and individually select a color for each point. If I did that, I’d run up against Google’s limited, 30-color palette and end up with lots of random point colors before Google defaulted to coloring the rest gray.
What I choose here is the type of business, which is listed under the column labeled “section.”
In that column, I have publishers, printers, and three different types of booksellers:
- Printers-Book and Commercial
- Publishers
- Books-Retail
- Books-Second Hand
- Books-Wholesale
To make these stand out nicely against my base map, I chose contrasting colors. After all, contrast is an easy way to make one bit of information stand apart from another.
In this situation, my chosen base map has quite a bit of light gray and orange. Glancing at my handy color wheel, I can see that purples are opposite the oranges. Looking at the purples in Google’s options, I choose a darker shade to contrast with the light map. That’s one down.
For the next, I want Publishers to complement Printers but be a clearly separate category. To meet that goal, I picked a darker purply-blue shade.
Moving to Books-Retail, I want them to stand as a separate category from the Printers and Publishers. I want them to complement my purples and still stand out against the grays and oranges. To do that, I go for one of Google’s dark greens.
Looking at the last two categories, I don’t particularly care whether people can immediately differentiate the second-hand or wholesale bookstores from the retail category. Having too many colors can also be distracting. To minimize visual clutter, I’m going to make all the bookstores the same color.
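Google My Maps only exposes these choices through its menus, but the same “style by column” idea is easy to express in code. A minimal sketch, assuming folium and the geocoded CSV from earlier (the file name and the “section,” “latitude,” “longitude,” and “name” column names are my stand-ins), mirroring the color decisions above:

```python
# A sketch, assuming folium + pandas: color points by the "section" column,
# mirroring the choices described above (file and column names are stand-ins).
import folium
import pandas as pd

colors = {
    "Printers-Book and Commercial": "purple",
    "Publishers": "darkblue",          # the darker purply-blue
    "Books-Retail": "darkgreen",
    "Books-Second Hand": "darkgreen",  # bookstores share one color
    "Books-Wholesale": "darkgreen",
}

df = pd.read_csv("publishers_1955_geocoded.csv")
m = folium.Map(location=[37.79, -122.40], zoom_start=14, tiles="CartoDB positron")

for _, row in df.iterrows():
    folium.CircleMarker(
        location=[row["latitude"], row["longitude"]],
        radius=5,
        color=colors.get(row["section"], "gray"),  # gray for anything unexpected
        fill=True,
        popup=row["name"],
    ).add_to(m)

m.save("publishers_1955.html")
```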
Pop-ups / Information Dock
For this dataset, the pop-ups are not overly important. What matters for my argument here is the spread. Nonetheless, I want to be aware of what people will see if they click on my different data points.
[Citylights pop-up right]
In this shot, I have an example of what other people will see. Essentially, it’s all of the columns converted to a single-entry form. I can edit these if desired and—importantly—add things like latitude and longitude.
The easiest way to drop information from the pop-up is to delete the column from the data sheet and re-import the data.
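As a sketch of that cleanup step, assuming pandas and the same stand-in file name from earlier, dropping a column before re-importing might look like this:

```python
# A sketch, assuming pandas: drop a column I don't want surfaced in the
# pop-ups, then write a fresh CSV to re-import into My Maps.
# File and column names are stand-ins for my actual sheet.
import pandas as pd

df = pd.read_csv("publishers_1955_geocoded.csv")
df = df.drop(columns=["state"])  # "state" is the same for every row anyway
df.to_csv("publishers_1955_for_mymaps.csv", index=False)
```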
Sharing
As I finish up my map, I need to decide whether I want to keep it private (the default) or share it. Some of my maps I keep private because they’re lists of favorite restaurants or loosely planned vacations. For example, a sibling is planning on getting married in Cádiz, Spain, and I have a map tagging places I am considering for my travel itinerary.
Here, in contrast, I want friends and fellow interested parties to be able to see it and find it. To make sure that’s possible, I clicked on “Share” above my layers. On the pop-up (as figured here) I switched the toggles to allow “Anyone with this link [to] view” and “Let others search for and find this map on the internet.” The latter, in theory, will permit people searching for 1955 publishing data in San Francisco to find my beautiful, high-contrast map.
Important: This is also where I can find the link to share the published version of the map. If I pulled the link from the top of my window, I’d be sharing the editable version. Be aware, however, that the editable and public versions look a pinch different. As embedded at the top of this post, the published version will not allow the viewer to edit the material and will show my information in a sidebar, as opposed to the edit view’s pop-ups.
Next steps
To see how those institutions sat in their 1950s world, I’d like to see how those points align against a 1950s San Francisco map. To do that, I’d need to find an appropriate map and add it as a layer under my dataset. At this time, however, Google My Maps does not allow me to add image or map layers. So, in two weeks I’ll write about importing image layers into Esri’s ArcGIS.
Digital Archives and the DH Working Group on Nov. 4
To my delight, I can now announce that the next Digital Humanities Working Group at UC Berkeley is November 4 at 1pm in Doe Library, Room 223.
For the workshop, we have two amazing speakers for lightning talks. They are:
Danny Benett, MA student in Folklore, will discuss the Berkeley Folklore Archive, which is making ~500,000 folklore items digitally accessible.
Adrienne Serra, Digital Projects Archivist at The Bancroft Library, will demo an interactive ArcGIS map that allows users to explore digital collections about Spanish and Mexican land grants in California.
We hope to see you there! Do consider signing up (link), as we order pizza and like to have a rough headcount.
The UC Berkeley Digital Humanities Working Group is a research community founded to facilitate interdisciplinary conversations in digital humanities and cultural analytics. It is a welcoming and supportive community for all things digital humanities.
The event is co-sponsored by the D-Lab and Data & Digital Scholarship Services.
Rare Photography Book Donations from Richard Sun: Part 3 of 3
Check out these photography books generously donated by Richard Sun. Inquire with the Art History/Classics Library to view them in 308F Doe Library. Please request in advance, as they may be located off site.
As Terras do Fim do Mundo Beyond Drifting Body
Migrant Narrow Distances Now that you are Mine
Rare Photography Book Donations from Richard Sun: Part 2 of 3
These rare books are part of a generous curated donation from Richard Sun. They may be viewed in the Art History/ Classics Library. Request them in advance as they may be stored off site.
Zaido The Earth is Only a Little Dust Under our Feet Night Calls
As it was Given to Me Dream Children Jamais je ne t’oublierai
The Hidden Mother Landing Lights Park Hello My Name Is