2014-11-14T09:17:08

New features: table editing and manual addition of articles

There are two exciting new features in BrainSpell: the editing of coordinate tables and the ability to manually add new articles to BrainSpell’s database.

Previously, articles and coordinate tables strictly reflected those parsed by NeuroSynth, and it was only possible to flag them as correct or incorrect. Now, if a table has incorrect or missing data, you can fix it! Once you are logged in, all the table fields become editable (just click on them), and the tables display -/+ icons beside each row to delete an incorrect row or add a missing one (see Fig. 1). In addition, the ‘Title’ and ‘Caption’ fields of each table are also editable.


Fig. 1. Editing of table coordinates

It is now also possible to manually add articles to BrainSpell’s database. To add a new article, find it in PubMed and copy its PubMed ID. For example, we will add the article by Shepherd et al. (2014), PubMed ID 25268788. Now, enter the URL http://brainspell.org/article/ followed by the PubMed ID of your article; in our case: http://brainspell.org/article/25268788. When BrainSpell realises that the article is not in the database, it will pull its metadata (authors, title, abstract, tags, etc.) from PubMed (Fig. 2).
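The URL scheme above is simple enough to script; here is a minimal sketch of a helper that builds the article URL from a PubMed ID (the function name is hypothetical, not part of BrainSpell):

```python
def brainspell_article_url(pmid: str) -> str:
    """Build the BrainSpell URL for a given PubMed ID.

    PubMed IDs are plain strings of digits, e.g. "25268788".
    """
    if not pmid.isdigit():
        raise ValueError("PubMed IDs are numeric, got %r" % pmid)
    return "http://brainspell.org/article/" + pmid

print(brainspell_article_url("25268788"))
# http://brainspell.org/article/25268788
```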


Fig. 2. BrainSpell offers to manually add an article if it is not already in the database

If you are logged in, it will ask you whether you want to create a new entry by entering the number of stereotaxic coordinate tables present in the article, in our case 2 (Fig. 3).


Fig. 3. Adding 2 stereotaxic tables to the new article

The experiments have empty titles and captions, and a coordinate at X=0, Y=0 and Z=0 is present by default, with its corresponding sphere in the translucent brain (Fig. 4).
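Such a blank experiment can be pictured as a small record; the sketch below is a hypothetical model of it (the field names are illustrative, not BrainSpell’s actual schema):

```python
def new_experiment():
    """A blank experiment table: empty title and caption, and one
    placeholder coordinate at the origin (X=0, Y=0, Z=0), which is
    what produces the default sphere in the translucent brain."""
    return {
        "title": "",
        "caption": "",
        "locations": [[0, 0, 0]],  # one default stereotaxic coordinate
    }

# A newly created article with 2 tables would start out as:
experiments = [new_experiment() for _ in range(2)]
print(experiments[0]["locations"])  # [[0, 0, 0]]
```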


Fig. 4. An empty experiment. You can edit the table’s title and caption, and add stereotaxic coordinates to it.

You can edit all these fields (titles, captions and coordinates) to enter the complete table, as in Fig. 5.


Fig. 5. The same experiment after filling all the data

These features are still very new, and there may be bugs lurking around. Please let us know if you find anything, using the ‘issues’ tracker on GitHub at http://github.com/r03ert0/brainspell.

2014-07-29T09:27:14

New feature: Coordinate selection


Coordinate selection: click on a sphere (or a table row) to highlight the corresponding table row (or sphere).

Now, in the translucent brains showing the stereotaxic coordinates for the experiments in a paper, you can click on a red sphere to highlight the corresponding coordinate row in the table, or, vice versa, click on a row to highlight the corresponding sphere.
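Object picking of this kind typically starts by converting the clicked pixel position into normalized device coordinates before casting a ray into the 3D scene. A minimal sketch of that first conversion step (assuming a canvas of known size; the function name is hypothetical):

```python
def click_to_ndc(px, py, width, height):
    """Convert a mouse click at pixel (px, py), measured from the
    top-left of a width x height canvas, into normalized device
    coordinates in [-1, 1], with y pointing up as in OpenGL/WebGL."""
    x = (px / width) * 2 - 1
    y = 1 - (py / height) * 2  # screen y grows downward, NDC y upward
    return x, y

# The centre of an 800x600 canvas maps to the NDC origin:
print(click_to_ndc(400, 300, 800, 600))  # (0.0, 0.0)
```

In three.js these normalized coordinates are then fed to a raycaster, which returns the scene objects (here, the red spheres) under the cursor.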

The code I used is based on the tutorial published by Soledad Penadés on her excellent blog: http://soledadpenades.com/articles/three-js-tutorials/object-picking.

2014-07-14T06:47:04

Thank you Max!

The 29 articles responding to the search for “Love” in BrainSpell on 14 July 2014. Thank you guys for your help!


Daniel Margulies from the Neuroanatomy and Connectivity group at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig is organising a Pizza-BrainSpell-Tagging-Sprint! Thank you very much!

When tagging I often focus first on the methodological aspects – check whether the tables are correct, find the stereotaxic space and the number of subjects – and then I go for the ontologies. Sometimes it is possible to vote for the MeSH tags (those added by PubMed) just from reading the abstract, but to tag the experiments you’ll have to read the article’s full text. If you don’t have much time, you can just tag using BrainMap’s ontology, which is very small, but Cognitive Atlas’s ontologies are more detailed and can also be edited (at cognitiveatlas.org).

We’ll also be tagging articles at the Institut Pasteur. In case anyone needs help or has suggestions for improvement, I have opened this chat room, which I’ll be checking often:

http://tlk.io/brainspell

2014-07-12T05:10:56

Now you can retract a vote


Previously, if you had agreed with a tag, you could only change your mind and disagree with it. Now it is also possible to retract your vote. If the tag was in an experiment ontology, and your vote was the only one, retraction will remove the tag from the tag list.
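The retraction behaviour can be sketched as follows (a hypothetical model of the vote bookkeeping, not BrainSpell’s actual code):

```python
def retract_vote(tags, tag, user):
    """Remove `user`'s vote on `tag`. If the retracted vote was the
    only one, the tag itself disappears from the tag list."""
    votes = tags.get(tag, {})
    votes.pop(user, None)
    if not votes:               # last vote retracted: drop the tag entirely
        tags.pop(tag, None)
    else:
        tags[tag] = votes
    return tags

tags = {"emotion": {"alice": "agree", "bob": "disagree"}}
retract_vote(tags, "emotion", "bob")
print(tags)  # {'emotion': {'alice': 'agree'}}
retract_vote(tags, "emotion", "alice")
print(tags)  # {}
```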

2013-12-31T16:14:18

Brain viewer, more data fields, and discussion section

I have just uploaded a new update of brainspell. There are several new things:

  1. Now when you search for a term, a simple stereotaxic brain viewer shows the locations corresponding to your search. The first time you use it, the viewer may take some time to load the brain anatomy used for reference (Colin27), but afterwards it will be stored locally in your browser’s cache and be much faster.
  2. I added more fields for gathering data about the articles in the database. There’s one field for entering the stereotaxic space (either MNI or Talairach), and another one to enter the total number of subjects used in the article (this information is quite hard to get using automatic text-mining alone…). There is also a field below each table to flag incorrectly parsed data, such as a table with repeated coordinates, or one displaying numbers which are actually not stereotaxic coordinates.
  3. Finally, at the end of each paper there is a new Discussion section, where you can better explain your choices, or discuss someone else’s tags.
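Since one of the new fields records whether coordinates are reported in MNI or Talairach space, it may help to recall that the two are related by an approximate transform. Below is a sketch of the commonly cited Brett piecewise affine approximation (mni2tal), given for illustration only; use a validated tool for real analyses:

```python
def mni2tal(x, y, z):
    """Approximate conversion of MNI coordinates (mm) to Talairach
    space, using Matthew Brett's widely cited piecewise affine
    transform (different coefficients above and below z = 0)."""
    if z >= 0:
        return (0.9900 * x,
                0.9688 * y + 0.0460 * z,
                -0.0485 * y + 0.9189 * z)
    return (0.9900 * x,
            0.9688 * y + 0.0420 * z,
            -0.0485 * y + 0.8390 * z)

print(mni2tal(10, 20, 30))
```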

I recently discovered how to make screencasts, and made these two YouTube videos: one showing brainspell’s GUI (a previous version… I’ll have to record a new one), and the other showing the Brain Coactivation Map Viewer. If you like brainspell, you may want to check out the CoactivationMap.app. It uses the same data as brainspell (kindly provided by Tal Yarkoni’s excellent neurosynth web app) to create an interactive coactivation map of the human brain. As you browse through the different brain regions, you will see the corresponding coactivation networks, and a list of the MeSH tags whose associated networks most closely match the current one. Clicking on those tags will launch a query in brainspell showing the corresponding articles. Here are the videos:

Quick intro to brainspell

Quick intro to the CoactivationMap.app

2013-07-04T16:18:58

Full text links

Now articles present a link to the full-text version on the journal’s website. (Additionally, several bugs have been corrected – from the many that may still lurk around.)

2013-06-04T16:20:13

Many first things today!

First functional version of brainspell is now on-line, taking queries, user registrations, and tags!

First version of the database is also available for download.

First round of beta testing (with mostly “sandbox” tagging).

In summary, the first day of our open, human curated, classification of the neuroimaging literature!
